I have extracted some data from a file and want to write it to a second file. But my program is returning the error:
sequence item 1: expected string, list found
This appears to be happening because write() wants a string but is receiving a list. So, with respect to this code, how can I convert the list buffer to a string so that I can save its contents to file2?
file = open('file1.txt', 'r')
file2 = open('file2.txt', 'w')
buffer = []
rec = file.readlines()
for line in rec:
    field = line.split()
    term1 = field[0]
    buffer.append(term1)
    term2 = field[1]
    buffer.append(term2)
file2.write(buffer)   # <== error
file.close()
file2.close()
Use str.join(). From the Python documentation:

Return a string which is the concatenation of the strings in the iterable iterable. The separator between elements is the string providing this method.
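A minimal sketch of applying join to the question's case (the sample values and the newline separator are assumptions; the question does not say how the output should be delimited):

```python
# buffer stands in for the list built in the question (sample values are assumptions)
buffer = ["alpha", "beta", "gamma"]

# write() accepts a single str, not a list, so join the list first;
# "\n" puts each accumulated term on its own line.
with open("file2.txt", "w") as file2:
    file2.write("\n".join(buffer))
```

Any string can serve as the separator: " ".join(buffer) gives space-separated output, "".join(buffer) concatenates with nothing in between.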
file2.write( str(buffer) )
str(anything) will convert any Python object into its string representation, similar to the output you get from print(anything), but as a string.

NOTE: This probably isn't what the OP wants, as it gives no control over how the elements of buffer are concatenated: it keeps the list's brackets and quotes and puts ", " between each element. But it may be useful to someone else.
buffer = ['a', 'b', 'c']
obj = str(buffer)
obj[1:len(obj)-1]

will give "'a', 'b', 'c'" as output
import functools
file2.write(functools.reduce(lambda x, y: x + y, buffer))
import functools, operator
file2.write(functools.reduce(operator.add, buffer))
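For illustration, a self-contained version of the reduce approach (the sample list is an assumption; reduce folds the list pairwise, so each + allocates a new string, which is why the FAQ quoted below recommends join for many items):

```python
import functools
import operator

# Sample list standing in for buffer (values are assumptions)
buffer = ["a", "b", "c"]

# reduce applies the operator left to right: ("a" + "b") + "c"
joined = functools.reduce(operator.add, buffer)
print(joined)  # abc
```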
# handy when you want to output a list
lst = [1, 2, 3]          # avoid shadowing the built-in name `list`
stringRepr = str(lst)
print(stringRepr)        # '[1, 2, 3]'
From the official Python Programming FAQ for Python 3.6.4:
What is the most efficient way to concatenate many strings together?
str and bytes objects are immutable, therefore concatenating many strings together is inefficient as each concatenation creates a new object. In the general case, the total runtime cost is quadratic in the total string length.
To accumulate many str objects, the recommended idiom is to place them into a list and call str.join() at the end:
chunks = []
for s in my_strings:
    chunks.append(s)
result = "".join(chunks)
(another reasonably efficient idiom is to use io.StringIO)
To accumulate many bytes objects, the recommended idiom is to extend a bytearray object using in-place concatenation (the += operator):
result = bytearray()
for b in my_bytes_objects:
    result += b
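Applying the FAQ's list-then-join idiom back to the question's code, a sketch (the sample input contents and the space separator are assumptions; the question does not specify the output format):

```python
# Create a small sample input file (contents are assumptions).
with open("file1.txt", "w") as f:
    f.write("alpha beta\ngamma delta\n")

# Accumulate the split fields in a list, then join once and
# write once, instead of calling write() with a list.
chunks = []
with open("file1.txt") as src:
    for line in src:
        chunks.extend(line.split())

with open("file2.txt", "w") as dst:
    dst.write(" ".join(chunks))
```

This avoids both the original TypeError (write() gets one str) and the quadratic cost of repeated + concatenation.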