I have over 1 million rows in a SQL database, and I want to write the whole table to a TSV file. I am using the code below to fetch the first 100,000 rows, but fetching them and writing the TSV file takes nearly 20 minutes. Is there another way to speed this up?
cursor.execute("select top(100000) * from dbo.StoreLocations_Repo_V10")
store_details = cursor.fetchall()
store_details_list = [list(elem) for elem in store_details]
df = pd.DataFrame(store_details_list)
with open('result.tsv', 'w', encoding='UTF-8') as f:
    df.to_csv(f, header=False, sep='\t')

Posted on 2018-05-10 10:36:31
You can do it like this:
import pyodbc
import pandas as pd
connection = pyodbc.connect('Driver={SQL Server Native Client 11.0};'
                            'Server=YourServer;'
                            'Database=YourDB;'
                            'Trusted_Connection=yes;')
data = "select top(100000) * from dbo.StoreLocations_Repo_V10"
df = pd.read_sql(data, connection)
df.to_csv('result.tsv', header=False, sep='\t')

https://stackoverflow.com/questions/50270511
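For the full million-row table, another option worth trying is to stream the result set in chunks with `pd.read_sql(..., chunksize=...)` and append each chunk to the TSV, so the whole table never has to sit in memory at once. A minimal sketch of the pattern, using an in-memory SQLite table as a stand-in for the real pyodbc connection and `dbo.StoreLocations_Repo_V10` (the table name, column names, and chunk size here are illustrative assumptions):

```python
import sqlite3
import pandas as pd

# Stand-in for the SQL Server connection: a small in-memory table.
conn = sqlite3.connect(':memory:')
conn.execute("create table stores (id integer, name text)")
conn.executemany("insert into stores values (?, ?)",
                 [(i, f"store_{i}") for i in range(10000)])

# chunksize makes read_sql yield DataFrames of at most that many
# rows, so memory stays bounded regardless of the table size.
first = True
for chunk in pd.read_sql("select * from stores", conn, chunksize=2500):
    chunk.to_csv('result.tsv',
                 mode='w' if first else 'a',  # overwrite once, then append
                 header=False, sep='\t', index=False)
    first = False
```

With the real database you would pass the pyodbc connection and the original query instead of the SQLite stand-in; tuning the chunk size (tens of thousands of rows is a common starting point) trades memory for per-chunk overhead.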