Tags: python-2.7, ssh, pexpect

How do I set the column width of a pexpect ssh session?


I am writing a simple Python script to connect to a SAN via SSH, run a set of commands, and exit. Ultimately each command's output will be logged to a separate file along with a timestamp. This is because the device we are connecting to doesn't support certificate-based SSH connections and doesn't have decent logging capabilities on its current firmware revision.

The issue I seem to be running into is that the SSH session pexpect creates is limited to 78 characters wide, while the results generated by each command are significantly wider (155 characters). This causes a couple of problems.

First, the wrapped results are significantly more difficult to parse. Second, because the screen is effectively smaller, the final volume command won't execute properly: the pexpect-launched SSH session gets prompted to "press any key to continue" and the script times out.

How do I change the column width of the pexpect session?

Here is the current code (it works but is incomplete):

#!/usr/bin/python

import pexpect
import time

PASS='mypassword'
HOST='1.2.3.4'
LOGIN_COMMAND='ssh manage@'+HOST
CTL_COMMAND='show controller-statistics'
VDISK_COMMAND='show vdisk-statistics'
VOL_COMMAND='show volume-statistics'

VDISK_LOG='vdisk.log'
VOLUME_LOG='volume.log'
CONTROLLER_LOG='controller.log'

# time.strftime returns the timestamp as a string; os.system('date ...')
# would only return the command's exit status, not the date itself.
DATE=time.strftime('%Y%m%d%H%M%S')

child=pexpect.spawn(LOGIN_COMMAND)
child.setecho(True)
child.logfile = open('FetchSan.log','w+')
child.expect('Password: ')
child.sendline(PASS)
child.expect('# ')
child.sendline(CTL_COMMAND)
print child.before
child.expect('# ')
child.sendline(VDISK_COMMAND)
print child.before
child.expect('# ')
print "Sending "+VOL_COMMAND
child.sendline(VOL_COMMAND)
print child.before
child.expect('# ')
child.sendline('exit')
child.expect(pexpect.EOF)
print child.before

The output expected:

# show controller-statistics
Durable ID     CPU Load   Power On Time (Secs)   Bytes per second   IOPS             Number of Reads  Number of Writes Data Read        Data Written     
---------------------------------------------------------------------------------------------------------------------------------------------------------
controller_A   0          45963169               1573.3KB           67               386769785        514179976        6687.8GB         5750.6GB
controller_B   20         45963088               4627.4KB           421              3208370173       587661282        63.9TB           5211.2GB
---------------------------------------------------------------------------------------------------------------------------------------------------------
Success: Command completed successfully.

# show vdisk-statistics     
Name   Serial Number                    Bytes per second   IOPS             Number of Reads  Number of Writes Data Read        Data Written     
------------------------------------------------------------------------------------------------------------------------------------------------
CRS    00c0ff13349e000006d5c44f00000000 0B                 0                45861            26756            3233.0MB         106.2MB
DATA   00c0ff1311f300006dd7c44f00000000 2282.4KB           164              23229435         76509765         5506.7GB         1605.3GB
DATA1  00c0ff1311f3000087d8c44f00000000 2286.5KB           167              23490851         78314374         5519.0GB         1603.8GB
DATA2  00c0ff1311f30000c2f8ce5700000000 0B                 0                26               4                1446.9KB         65.5KB
FRA    00c0ff13349e000001d8c44f00000000 654.8KB            5                3049980          15317236         1187.3GB         1942.1GB
FRA1   00c0ff13349e000007d9c44f00000000 778.7KB            6                3016569          15234734         1179.3GB         1940.4GB
------------------------------------------------------------------------------------------------------------------------------------------------
Success: Command completed successfully.

# show volume-statistics    
Name        Serial Number                    Bytes per second   IOPS             Number of Reads  Number of Writes Data Read        Data Written     
-----------------------------------------------------------------------------------------------------------------------------------------------------
CRS_v001    00c0ff13349e0000fdd6c44f01000000 14.8KB             5                239611146        107147564        1321.1GB         110.5GB          
DATA1_v001  00c0ff1311f30000d0d8c44f01000000 2402.8KB           218              1701488316       336678620        33.9TB           3184.6GB         
DATA2_v001  00c0ff1311f3000040f9ce5701000000 0B                 0                921              15               2273.7KB         2114.0KB         
DATA_v001   00c0ff1311f30000bdd7c44f01000000 2303.4KB           209              1506883611       250984824        30.0TB           2026.6GB         
FRA1_v001   00c0ff13349e00001ed9c44f01000000 709.1KB            28               25123082         161710495        1891.0GB         2230.0GB         
FRA_v001    00c0ff13349e00001fd8c44f01000000 793.0KB            34               122052720        245322281        3475.7GB         3410.0GB         
-----------------------------------------------------------------------------------------------------------------------------------------------------
Success: Command completed successfully.

The output as printed to the terminal (as mentioned, the 3rd command won't execute in its current state):

show controller-statistics
Durable ID     CPU Load   Power On Time (Secs)   Bytes per second   
  IOPS             Number of Reads  Number of Writes Data Read        
  Data Written     
----------------------------------------------------------------------
controller_A   3          45962495               3803.1KB           
  73               386765821        514137947        6687.8GB         
  5748.9GB
controller_B   20         45962413               5000.7KB           
  415              3208317860       587434274        63.9TB           
  5208.8GB
----------------------------------------------------------------------
Success: Command completed successfully.


Sending show volume-statistics
show vdisk-statistics
Name   Serial Number                    Bytes per second   IOPS             
  Number of Reads  Number of Writes Data Read        Data Written     
----------------------------------------------------------------------------
CRS    00c0ff13349e000006d5c44f00000000 0B                 0                
  45861            26756            3233.0MB         106.2MB
DATA   00c0ff1311f300006dd7c44f00000000 2187.2KB           152              
  23220764         76411017         5506.3GB         1604.1GB
DATA1  00c0ff1311f3000087d8c44f00000000 2295.2KB           154              
  23481442         78215540         5518.5GB         1602.6GB
DATA2  00c0ff1311f30000c2f8ce5700000000 0B                 0                
  26               4                1446.9KB         65.5KB
FRA    00c0ff13349e000001d8c44f00000000 1829.3KB           14               
  3049951          15310681         1187.3GB         1941.2GB
FRA1   00c0ff13349e000007d9c44f00000000 1872.8KB           14               
  3016521          15228157         1179.3GB         1939.5GB
----------------------------------------------------------------------------
Success: Command completed successfully.
Traceback (most recent call last):
  File "./fetchSAN.py", line 34, in <module>
    child.expect('# ')
  File "/Library/Python/2.7/site-packages/pexpect-4.2.1-py2.7.egg/pexpect/spawnbase.py", line 321, in expect
    timeout, searchwindowsize, async)
  File "/Library/Python/2.7/site-packages/pexpect-4.2.1-py2.7.egg/pexpect/spawnbase.py", line 345, in expect_list
    return exp.expect_loop(timeout)
  File "/Library/Python/2.7/site-packages/pexpect-4.2.1-py2.7.egg/pexpect/expect.py", line 107, in expect_loop
    return self.timeout(e)
  File "/Library/Python/2.7/site-packages/pexpect-4.2.1-py2.7.egg/pexpect/expect.py", line 70, in timeout
    raise TIMEOUT(msg)
pexpect.exceptions.TIMEOUT: Timeout exceeded.
<pexpect.pty_spawn.spawn object at 0x105333910>
command: /usr/bin/ssh
args: ['/usr/bin/ssh', '[email protected]']
buffer (last 100 chars): '-------------------------------------------------------------\r\nPress any key to continue (Q to quit)'
before (last 100 chars): '-------------------------------------------------------------\r\nPress any key to continue (Q to quit)'
after: <class 'pexpect.exceptions.TIMEOUT'>
match: None
match_index: None
exitstatus: None
flag_eof: False
pid: 19519
child_fd: 5
closed: False
timeout: 30
delimiter: <class 'pexpect.exceptions.EOF'>
logfile: <open file 'FetchSan.log', mode 'w+' at 0x1053321e0>
logfile_read: None
logfile_send: None
maxread: 2000
ignorecase: False
searchwindowsize: None
delaybeforesend: 0.05
delayafterclose: 0.1
delayafterterminate: 0.1
searcher: searcher_re:
    0: re.compile("# ")

And here is what is captured in the log:

Password: mypassword


HP StorageWorks MSA Storage P2000 G3 FC
System Name: Uninitialized Name
System Location:Uninitialized Location
Version:TS230P008
# show controller-statistics
show controller-statistics
Durable ID     CPU Load   Power On Time (Secs)   Bytes per second   
  IOPS             Number of Reads  Number of Writes Data Read        
  Data Written     
----------------------------------------------------------------------
controller_A   3          45962495               3803.1KB           
  73               386765821        514137947        6687.8GB         
  5748.9GB
controller_B   20         45962413               5000.7KB           
  415              3208317860       587434274        63.9TB           
  5208.8GB
----------------------------------------------------------------------
Success: Command completed successfully.

# show vdisk-statistics
show vdisk-statistics
Name   Serial Number                    Bytes per second   IOPS             
  Number of Reads  Number of Writes Data Read        Data Written     
----------------------------------------------------------------------------
CRS    00c0ff13349e000006d5c44f00000000 0B                 0                
  45861            26756            3233.0MB         106.2MB
DATA   00c0ff1311f300006dd7c44f00000000 2187.2KB           152              
  23220764         76411017         5506.3GB         1604.1GB
DATA1  00c0ff1311f3000087d8c44f00000000 2295.2KB           154              
  23481442         78215540         5518.5GB         1602.6GB
DATA2  00c0ff1311f30000c2f8ce5700000000 0B                 0                
  26               4                1446.9KB         65.5KB
FRA    00c0ff13349e000001d8c44f00000000 1829.3KB           14               
  3049951          15310681         1187.3GB         1941.2GB
FRA1   00c0ff13349e000007d9c44f00000000 1872.8KB           14               
  3016521          15228157         1179.3GB         1939.5GB
----------------------------------------------------------------------------
Success: Command completed successfully.

# show volume-statistics
show volume-statistics
Name        Serial Number                    Bytes per second   
  IOPS             Number of Reads  Number of Writes Data Read        
  Data Written     
----------------------------------------------------------------------
CRS_v001    00c0ff13349e0000fdd6c44f01000000 11.7KB             
  5                239609039        107145979        1321.0GB         
  110.5GB          
DATA1_v001  00c0ff1311f30000d0d8c44f01000000 2604.5KB           
  209              1701459941       336563041        33.9TB           
  3183.3GB         
DATA2_v001  00c0ff1311f3000040f9ce5701000000 0B                 
  0                921              15               2273.7KB         
  2114.0KB         
DATA_v001   00c0ff1311f30000bdd7c44f01000000 2382.8KB           
  194              1506859273       250871273        30.0TB           
  2025.4GB         
FRA1_v001   00c0ff13349e00001ed9c44f01000000 1923.5KB           
  31               25123006         161690520        1891.0GB         
  2229.1GB         
FRA_v001    00c0ff13349e00001fd8c44f01000000 2008.5KB           
  37               122050872        245301514        3475.7GB         
  3409.1GB         
----------------------------------------------------------------------
Press any key to continue (Q to quit)% 

Solution

  • As a starting point: According to the manual, that SAN has a command to disable the pager. See the documentation for set cli-parameters pager off. It may be sufficient to execute that command. It may also have a command to set the terminal rows and columns that it uses for formatting output, although I wasn't able to find one.

    Getting to your question: When an ssh client connects to a server and requests an interactive session, it can optionally request a PTY (pseudo-tty) for the server side of the session. When it does that, it informs the server of the lines, columns, and terminal type which the server should use for the TTY. Your SAN may honor PTY requests and use the lines and columns values to format its output. Or it may not.

    The ssh client gets the rows and columns for the PTY request from the TTY on its standard input, which in this case is the PTY that pexpect uses to communicate with ssh. This question discusses how to set the terminal size for a pexpect session. ssh doesn't honor the LINES or COLUMNS environment variables as far as I can tell, so I doubt that approach would work. However, calling child.setwinsize() after spawning ssh ought to work:

    child = pexpect.spawn(cmd)
    child.setwinsize(400,400)
    
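For a concrete picture of what setwinsize() does under the hood: it issues a TIOCSWINSZ ioctl on the pty that pexpect opened for the child. This stdlib-only sketch (no SAN or ssh involved) sets a window size on a fresh pty pair and reads it back:

```python
import fcntl
import os
import struct
import termios

# pexpect's setwinsize() boils down to a TIOCSWINSZ ioctl on the child's pty.
# Open a fresh pty pair and set its size to 50 rows x 200 columns.
master, slave = os.openpty()
winsize = struct.pack('HHHH', 50, 200, 0, 0)  # rows, cols, xpixel, ypixel
fcntl.ioctl(master, termios.TIOCSWINSZ, winsize)

# Read the size back from the slave side; both ends share one winsize struct.
buf = fcntl.ioctl(slave, termios.TIOCGWINSZ, struct.pack('HHHH', 0, 0, 0, 0))
rows, cols = struct.unpack('HHHH', buf)[:2]
print(rows, cols)  # → 50 200
```

If the remote side honors the PTY request, this is the size the server-side TTY will report to programs that format their output to the terminal width.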

    If you have trouble with this, you could try setting the terminal size by invoking stty locally before running ssh (with x and y replaced by the desired number of rows and columns):

    child=pexpect.spawn('stty rows x cols y; ssh user@host')
    

    Finally, you need to make sure that ssh actually requests a PTY for the session. It does this by default in some cases, which should include the way you are running it. But it has a command-line option -tt to force it to allocate a PTY. You could add that option to the ssh command line to make sure:

    child=pexpect.spawn('ssh -tt user@host')

    or

    child=pexpect.spawn('stty rows x cols y; ssh -tt user@host')
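Putting the pieces together, here is a sketch of how the original script could combine -tt, setwinsize(), the pager-off command, and per-command timestamped logs. The window size, prompt strings, and the log_name()/fetch_san() helpers are illustrative assumptions, not taken from your environment:

```python
import time

def log_name(base):
    # Hypothetical helper: timestamped log name, e.g. vdisk-20170101120000.log
    return '%s-%s.log' % (base, time.strftime('%Y%m%d%H%M%S'))

def fetch_san(host, password, commands):
    """Sketch: run each SAN command over ssh and log its output separately."""
    import pexpect  # third-party; imported here so log_name() works without it

    # -tt forces ssh to request a PTY, so the SAN sees our window size;
    # encoding='utf-8' makes child.before a str we can write to a text file.
    child = pexpect.spawn('ssh -tt manage@' + host, encoding='utf-8')
    child.setwinsize(100, 200)  # rows, cols: wide enough for 155-char lines

    child.expect('Password: ')
    child.sendline(password)
    child.expect('# ')

    # Disable the pager so long output never stops at "Press any key".
    child.sendline('set cli-parameters pager off')
    child.expect('# ')

    for base, cmd in commands:
        child.sendline(cmd)
        child.expect('# ')  # wait for the prompt, THEN read the output
        with open(log_name(base), 'w') as f:
            f.write(child.before)

    child.sendline('exit')
    child.expect(pexpect.EOF)

# fetch_san('1.2.3.4', 'mypassword',
#           [('controller', 'show controller-statistics'),
#            ('vdisk', 'show vdisk-statistics'),
#            ('volume', 'show volume-statistics')])
```

Note the ordering: expect() the prompt first and only then read child.before, which holds everything printed since the previous match.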