fortran, intel-fortran

String array being nullified when passing


I am having trouble passing a string array. Consider the following example code:

! -- Module to declare variable
module my_data
   implicit none

   ! -- Declare as deferred-length allocatable array
   character(len=:), dimension(:), allocatable :: str_array
end module my_data

! -- Module to call subroutine
module my_subs
   implicit none

contains
subroutine a(str_array)
   character(len=*), dimension(:), intent(IN) :: str_array
   integer :: i, j
   character :: c

   do i=1,size(str_array)
      do j=1,len_trim(str_array(i))
         c = str_array(i)(j:j)

         ! -- Write i, j, character, and int representation
         write(*,*) 'In call: ', i, j, ' "'//c//'", ichar = ', ichar(c)
      enddo
   enddo
end subroutine a
end module my_subs

! -- Main program
program main
   use my_data, only : str_array
   use my_subs, only : a
   implicit none
   integer, parameter :: strlen = 200
   integer :: N, i, j
   character :: c

   ! -- Size of str array
   N = 2

   ! -- Allocate str_array, syntax from https://software.intel.com/en-us/forums/intel-visual-fortran-compiler-for-windows/topic/287349
   allocate(character(strlen) :: str_array(N))

   ! -- Set both to the same string
   str_array = 'abc'

   do i=1,size(str_array)
      do j=1,len_trim(str_array(i))
         c = str_array(i)(j:j)

         ! -- Write i, j, character, and int representation
         write(*,*) 'In main: ', i, j, ' "'//c//'", ichar = ', ichar(c)
      enddo
   enddo

   call a(str_array)
end program main

The string array is declared as a deferred-length allocatable array (allocated using the syntax from the linked Intel forum post), and the subroutine receives it as an assumed-length dummy argument. I allocate the array and set its values (two elements, both set to abc for this example). The main program prints full details about the string, then calls a subroutine that should print the same details.

Using PGI, GCC, or Intel 15.0, I get the result I expect:

chaud106@ln0005 [~/Testing] % ifort --version && ifort -check all -warn all main.f90 && ./a.out
ifort (IFORT) 15.0.3 20150407
Copyright (C) 1985-2015 Intel Corporation.  All rights reserved.

 In main:            1           1  "a", ichar =           97
 In main:            1           2  "b", ichar =           98
 In main:            1           3  "c", ichar =           99
 In main:            2           1  "a", ichar =           97
 In main:            2           2  "b", ichar =           98
 In main:            2           3  "c", ichar =           99
 In call:            1           1  "a", ichar =           97
 In call:            1           2  "b", ichar =           98
 In call:            1           3  "c", ichar =           99
 In call:            2           1  "a", ichar =           97
 In call:            2           2  "b", ichar =           98
 In call:            2           3  "c", ichar =           99

However, Intel 18.0 sets the second element of the character array (all 3 characters) to the null character:

chaud106@ln0005 [~/Testing] % ifort --version && ifort -check all -warn all main.f90 && ./a.out
ifort (IFORT) 18.0.0 20170811
Copyright (C) 1985-2017 Intel Corporation.  All rights reserved.

 In main:            1           1  "a", ichar =           97
 In main:            1           2  "b", ichar =           98
 In main:            1           3  "c", ichar =           99
 In main:            2           1  "a", ichar =           97
 In main:            2           2  "b", ichar =           98
 In main:            2           3  "c", ichar =           99
 In call:            1           1  "a", ichar =           97
 In call:            1           2  "b", ichar =           98
 In call:            1           3  "c", ichar =           99
 In call:            2           1  "", ichar =            0
 In call:            2           2  "", ichar =            0
 In call:            2           3  "", ichar =            0

I have several questions related to this behavior:

  1. Why is this occurring? I was thinking it could be related to Intel enforcing lhs-reallocation, but I'm not sure. Adding -assume norealloc_lhs didn't change anything.

  2. What is the correct syntax to pass a string array like this? Could I declare it differently and avoid this problem?

The versions of Intel I have access to on this machine have the following behavior:

  • ifort (IFORT) 15.0.2 20150121 - No nullification
  • ifort (IFORT) 15.0.3 20150407 - No nullification
  • ifort (IFORT) 16.0.3 20160415 - No nullification
  • ifort (IFORT) 17.0.4 20170411 - Nullifies
  • ifort (IFORT) 18.0.0 20170811 - Nullifies

On a different machine, which doesn't have any of the latest Intel versions:

  • ifort (IFORT) 14.0.2 20140120 - No nullification
  • ifort (IFORT) 16.0.0 20150815 - No nullification

Solution

  • Your program works as expected with ifort 18.0.3.

    I haven't tried with lots of previous versions, but I note that 17.0.1 was the point at which Fortran 2003 automatic allocation on intrinsic assignment became the default in that compiler.

    The problematic line appears to be

    str_array = 'abc'
    

    Here, str_array should first be deallocated, because the right-hand side is an expression with a different length parameter from the left-hand side. It is then allocated as a character of length 3 (the length of the right-hand side) and with shape [2] as before (the right-hand side is scalar). And that does happen, as can be seen with SIZE(str_array) and LEN(str_array). Something goes a little awry later on when the array is used as an actual argument, though.
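
    As a minimal sketch (a standalone program, separate from the question's code), the reallocation can be observed directly: after the assignment the deferred length drops from 200 to 3 while the shape stays [2].

    program check_realloc
       implicit none
       character(len=:), dimension(:), allocatable :: str_array

       allocate(character(200) :: str_array(2))
       write(*,*) 'before: len =', len(str_array), ' size =', size(str_array)   ! 200  2

       ! -- Intrinsic assignment with a length-3 scalar on the right-hand side
       str_array = 'abc'
       write(*,*) 'after:  len =', len(str_array), ' size =', size(str_array)   ! 3  2
    end program check_realloc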

    There are ways to work around this problem with 18.0.1:

    • give the dummy argument the value attribute;
    • str_array = ['abc','abc'] (previous allocation not required);
    • str_array(:) = 'abc' (if you don't want the reallocation).

    Many other workarounds are likely available, depending on exactly what you need; a short sketch of two of them follows below. Upgrade your compiler to the latest version if you can, though.
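
    Here is a short sketch of the array-constructor and whole-section assignments (again a standalone program, assuming the same deferred-length declaration as the question):

    program workarounds
       implicit none
       character(len=:), dimension(:), allocatable :: str_array

       ! -- Array constructor: automatic allocation gives len = 3 and size = 2,
       !    and no prior ALLOCATE statement is needed
       str_array = ['abc', 'abc']
       write(*,*) len(str_array), size(str_array)

       deallocate(str_array)

       ! -- Whole-section assignment: no reallocation, the length stays 200
       allocate(character(200) :: str_array(2))
       str_array(:) = 'abc'
       write(*,*) len(str_array), size(str_array)
    end program workarounds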