[Python] Parsing CSV
AI, ML, DL · 2019. 1. 20. 12:48
Python's standard-library csv module can be used to parse the CSV (comma-separated values) format.
import locale
import csv

lo = locale.getdefaultlocale()
# Tries to determine the default locale settings and returns them as a tuple of
# the form (language code, encoding).
print("Default language code: " + lo[0], "Default encoding: " + lo[1], sep="\n", end="\n\n")

filename = "list.csv"
with open(filename, "rt", encoding="euc_kr") as f:
    csv_data = f.read()

# Manual parsing: split into rows, then split each row on commas.
data = []
rows = csv_data.split("\n")
for row in rows:
    if row == "":
        continue
    cells = row.split(",")
    data.append(cells)

for c in data:
    print("%-8s %8s" % (c[1], c[2]))
print()
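Note that the manual split(",") approach above breaks as soon as a field itself contains a quoted comma, which is exactly what the csv module handles for you. A minimal sketch with a made-up row:

```python
import csv
import io

# A field containing a quoted comma: naive split(",") miscounts the cells,
# while csv.reader respects the quoting.
line = '101,"Math, Advanced",4300'

naive = line.split(",")
parsed = next(csv.reader(io.StringIO(line)))

print(len(naive))   # 4 - the quoted comma was treated as a separator
print(len(parsed))  # 3 - csv.reader keeps "Math, Advanced" as one field
```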
# Python csv library
with open(filename, "at", encoding="euc_kr") as f:
    # 'a' - open for writing, appending to the end of the file if it exists.
    # For binary read-write access, the mode 'w+b' opens and truncates the file to 0 bytes;
    # 'r+b' opens the file without truncation.
    csv_writer = csv.writer(f, delimiter=",", quotechar='"')
    csv_writer.writerow(["101", "Math", "4300"])
    csv_writer.writerow(["102", "Physics", "4800"])
    csv_writer.writerow(["103", "English", "5700"])
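The quotechar argument matters on the writing side too: when a cell contains the delimiter, csv.writer wraps it in the quote character automatically. A small sketch using an in-memory stream and an illustrative row (not one of the rows above):

```python
import csv
import io

# A cell containing the delimiter is quoted automatically by csv.writer,
# so the row can later be read back without any special handling.
buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar='"')
writer.writerow(["104", "Math, Advanced", "5100"])

print(buf.getvalue().strip())  # 104,"Math, Advanced",5100
```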
# To change the stream position, use seek() from the io module, e.g. f.seek(...).
# seek(offset[, whence])
# Change the stream position to the given byte offset. offset is interpreted relative to the
# position indicated by whence. The default value for whence is SEEK_SET. Values for whence are:
# SEEK_SET or 0 – start of the stream (the default); offset should be zero or positive
# SEEK_CUR or 1 – current stream position; offset may be negative
# SEEK_END or 2 – end of the stream; offset is usually negative
# Return the new absolute position.
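The three whence values described above can be demonstrated on an in-memory binary stream (file objects behave the same way):

```python
import io

b = io.BytesIO(b"0123456789")

b.seek(3)                # SEEK_SET (default): 3 bytes from the start
print(b.read(2))         # b'34'

b.seek(-2, io.SEEK_END)  # 2 bytes before the end of the stream
print(b.read())          # b'89'

b.seek(0)
b.seek(4, io.SEEK_CUR)   # relative to the current position
print(b.tell())          # 4
```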
with open(filename, "rt", encoding="euc_kr") as f:
    csv_reader = csv.reader(f, delimiter=",", quotechar='"')
    for cells in csv_reader:
        if cells == []:
            continue
        print("%-8s %8s" % (cells[1], cells[2]))
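A related convenience is csv.DictReader, which maps each row to a dict keyed by a header row. The header names and data here are illustrative assumptions, not the actual contents of list.csv:

```python
import csv
import io

# Hypothetical CSV with a header row; DictReader uses it as the keys.
sample = "id,name,price\n101,Math,4300\n102,Physics,4800\n"
for row in csv.DictReader(io.StringIO(sample)):
    print(row["name"], row["price"])
```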
list.csv:
Result:
list.csv after running:
