I've been trying to get a web-located comma-delimited text file read into a Python list. My code for reading a local file works fine, but the data needs to be on a secure web server. I'm stuck using Python 2 (because Apple), and it needs to run on stock hardware, so I can't use any non-standard Python extensions.
A representative web file reads:
#curl -k https://mywebserver.net:8443/valuefile.txt
ValueOne,1
ValueTwo,2
ValueThree,3
ValueFour,4
ValueFive,5
(I've ensured there are no spaces in the actual data.)
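For reference, this is the shape I'm after. A minimal sketch of the parsing, run against the sample rows above (the filename in the commented-out local version is hypothetical, since my actual working code isn't shown here):

```python
import csv

# Hypothetical local-file version (this part already works for me):
#     with open('valuefile.txt') as f:
#         table = list(csv.reader(f))

# The same parsing applied to the sample rows above:
sample_lines = ['ValueOne,1', 'ValueTwo,2', 'ValueThree,3',
                'ValueFour,4', 'ValueFive,5']
table = list(csv.reader(sample_lines))
print(table[1][1])  # prints 2
```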
Due to the lack of SNI support in Python 2, I'm resorting to curling the file and parsing that output. Here's my latest attempt:
import shlex
import subprocess
cmd = '''curl -k https://mywebserver.net:8443/valuefile.txt'''
args = shlex.split(cmd)
process = subprocess.Popen(args, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
allRows = list(process.stdout)
# Test what we got
print allRows
print allRows[1][1]
Output:
['ValueOne,1\r\n', 'ValueTwo,2\r\n', 'ValueThree,3\r\n', 'ValueFour,4\r\n', 'ValueFive,5\r\n']
a
So each list element is one whole line as a single string, and allRows[1][1] is pulling a single character out of 'ValueTwo,2'. I need to split those strings on the comma so I end up with (in this case) a 5x2 array.
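One extra split per line would get from the list printed above to that structure. A sketch run against the exact output shown:

```python
# The list printed above: each element is one whole line, CRLF included
allRows = ['ValueOne,1\r\n', 'ValueTwo,2\r\n', 'ValueThree,3\r\n',
           'ValueFour,4\r\n', 'ValueFive,5\r\n']

# Strip the line ending, then split each line on the comma
table = [row.strip().split(',') for row in allRows]

print(table[1][1])  # prints 2, not a
```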
Instead of shlex, I've tried split(), splitlines(), csv.reader(), and readline(). Instead of list() I've tried iter(). Instead of Popen, I've tried check_output. Nothing got me closer than the code above.
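For what it's worth, check_output can work here too, provided its single-string return value is split into lines before parsing. A sketch under that assumption (the fetch uses the server URL from above; the parsing step is demonstrated on canned data since that URL isn't reachable):

```python
import csv
import subprocess

def fetch_rows(url):
    # check_output returns the whole response body as one string
    # (on Python 3 it returns bytes, which would need decoding first)
    out = subprocess.check_output(['curl', '-k', '-s', url])
    # splitlines() discards the \r\n endings, leaving clean lines
    return [row for row in csv.reader(out.splitlines()) if row]

# The parsing step alone, on the payload shown earlier:
payload = 'ValueOne,1\r\nValueTwo,2\r\nValueThree,3\r\nValueFour,4\r\nValueFive,5\r\n'
rows = [row for row in csv.reader(payload.splitlines()) if row]
```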
I'm aware of using lexer.quotes to better divide up an input stream, but I don't know how to call that from within subprocess.Popen.
What method would achieve my goal?