Python FTP for Data Mining and Analysis

[Animation: map of ASOS stations]

Python’s file transfer protocol (FTP) library, ftplib, is a powerful tool for scraping data off of the internet. For this project, I will be downloading weather data from the Automated Surface Observing System (ASOS), which can be useful for weather models and forecasts. The ASOS network can also be used to calibrate satellite data, characterize incoming and moving storms, and direct air traffic. ASOS produces data at a one-minute sample interval that is available via FTP, and we will extensively analyze and visualize it using Python.


FTP Basics in Python 3.x

Python 3’s “ftplib” is a fairly simple interface for accessing data via FTP. To start, the basic example on Python’s ftplib documentation page suffices as an introduction to the library’s methods. We will change the FTP server from debian.org to ‘ftp.ncdc.noaa.gov’, where the ASOS data is housed. This is shown in the code snippet below:

from ftplib import FTP

ftp = FTP('ftp.ncdc.noaa.gov') # connect to the NOAA NCDC FTP server
ftp.login() # anonymous login
ftp.dir() # this should print the directories and files in the FTP directory

This snippet should print out the following directory listing from the “ftp.ncdc.noaa.gov” FTP server:

[Image: directory listing printed from the ftp.ncdc.noaa.gov FTP root]

From this listing we can see which directories are available for navigation and reading. The directory most important to us is ‘pub’, which stands for public. Navigating through it will show us what data is publicly available for FTP download.

If we now change our directory to ‘pub/data’, we can see all of the available public datasets provided by the National Centers for Environmental Information (NCEI), formerly called the National Climatic Data Center (hence the FTP server name ncdc.noaa.gov). We can navigate there and print out the public data listing using the following two simple lines of code:

 ftp.cwd('pub/data')
 ftp.dir()

The printed list will be fairly long, and most entries should be available to navigate and download data from. The folder we are interested in is called ‘asos-onemin’, which holds the one-minute-resolution ASOS data: temperature, humidity, barometric pressure, and much more! We will point our FTP navigation at that folder and then print out the files present in it:

from ftplib import FTP

ftp =  FTP('ftp.ncdc.noaa.gov')
ftp.login()
ftp.cwd('pub/data/asos-onemin')
ftp.retrlines('LIST')
ftp.close()

And finally, the names of the files/directories in a given FTP directory can be saved using the ‘nlst()’ method:

from ftplib import FTP

ftp =  FTP('ftp.ncdc.noaa.gov')
ftp.login()
ftp.cwd('pub/data/asos-onemin')
dirs = ftp.nlst()
[print(str(ii)+' '+jj) for ii,jj in enumerate(dirs)]
ftp.close()

The variable ‘dirs’ now contains all of the directories with our ASOS data; each folder name is a different one-minute ASOS data product for a given year. We know this because we can look at the readme.txt file in the directory! We can also download the non-directory files, which will help us make sense of the ASOS one-minute formats, using the following script:

from ftplib import FTP

ftp =  FTP('ftp.ncdc.noaa.gov') # ftp access to ncdc.noaa.gov
ftp.login() # anonymous ftp login
ftp.cwd('pub/data/asos-onemin') # change directory to asos-onemin
dirs = ftp.nlst() # list all files in directory
[print(str(ii)+' '+jj) for ii,jj in enumerate(dirs)] # print dirs/files and their indices
description_files = [ii for ii in dirs if len(ii.split('.'))>1] # entries containing '.' are files, not directories
for file in description_files:
    with open(file, 'wb') as fp:
        ftp.retrbinary('RETR '+file, fp.write) # save non-directory files (readme, etc.)
ftp.close() # close ftp connection

For the ASOS system, we get four files: the readme.txt file and three description files. I recommend looking through all of the deposited files for a better understanding of how the ASOS network functions and how the data is distributed.
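
As a quick check, we can peek at the first lines of the downloaded readme (a minimal sketch; swap in any of the description file names to inspect those instead):

with open('readme.txt','r') as f:
    for _ in range(20): # print the first 20 lines
        print(f.readline().rstrip())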


Automated Surface Observing System (ASOS) Stations

Before we dive into the data files themselves, it will be helpful to download the station description file, which is located at the following ftp address:

ftp.ncdc.noaa.gov/pub/data/ASOS_Station_Photos/asos-stations.txt

We first need to navigate to the ASOS_Station_Photos folder via FTP; then we can download the file and sift through it to understand each row and how it pertains to the ASOS network and the respective stations.
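
In isolation, that download step looks like this (a minimal sketch; the full script below folds it in with a check for an existing local copy):

from ftplib import FTP

ftp = FTP('ftp.ncdc.noaa.gov')
ftp.login() # anonymous login
ftp.cwd('pub/data/ASOS_Station_Photos')
with open('asos-stations.txt', 'wb') as fp:
    ftp.retrbinary('RETR asos-stations.txt', fp.write)
ftp.close()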

As an example of how to sift through the station list, I have included a larger snippet of code that goes through the processes mentioned above, sifts through the asos-stations.txt file, and pulls the station information. I use the general coordinates of New York City to test whether the program is working. The script below should print a few checks on whether the asos-onemin description files and the asos-stations.txt file were downloaded, and then print out the nearest station to the input coordinates:

from ftplib import FTP
import csv,os
import numpy as np

########################################
# FTP connection and directory handling
########################################
#
ftp =  FTP('ftp.ncdc.noaa.gov') # ftp access to ncdc.noaa.gov
ftp.login() # anonymous ftp login
ftp.cwd('pub/data/asos-onemin') # change directory to asos-onemin
dirs = ftp.nlst() # list all files in directory
[print(str(ii)+' '+jj) for ii,jj in enumerate(dirs)] # print dirs/files and their indices
description_files = [ii for ii in dirs if len(ii.split('.'))>1]
for file in description_files:
    if os.path.isfile(file):
        print('already downloaded file: '+file)
        continue
    with open(file, 'wb') as fp:
        ftp.retrbinary('RETR '+file, fp.write) # save non-directory files (readme, etc.)

########################################
# Accessing data files/directories
########################################
#
data_dir_indx = 44 # select index printed out in Python window
ftp.cwd(dirs[data_dir_indx]) # 44 == 6405-2019 [in my particular case]
print('Accessing: {} Folder'.format(dirs[data_dir_indx]))
data_files = ftp.nlst() # list all data files for directory above

########################################
# download ASOS file with all station
# properties (lat/lon/elevation)
########################################
#
ftp.cwd('/pub/data/ASOS_Station_Photos/')
asos_filename = 'asos-stations.txt'
if os.path.isfile(asos_filename):
    print('already downloaded file: '+asos_filename)
else:
    with open(asos_filename, 'wb') as fp:
        ftp.retrbinary('RETR '+asos_filename, fp.write) # save non-directory files (readme, etc.)

station_props,station_header,dash_header = [],[],[]
header_bool,dash_bool = False,False
# parse the fixed-width asos-stations.txt: find the header row (starts with
# 'NCDCID'), use the dashed separator row beneath it to get each column's
# width, then slice every subsequent station row at those widths
with open(asos_filename,newline='') as txtfile:
    csvreader = csv.reader(txtfile,delimiter='\t')
    for row in csvreader:
        if len(row)<1:
            continue
        if row[0][0:6]=='NCDCID':
            station_header = row[0]
            header_bool = True
        elif header_bool:
            if dash_bool:
                props = []
                props_iter = 0
                for qq in dash_header:
                    props.append(row[0][props_iter:props_iter+len(qq)+1])
                    props_iter+=len(qq)+1
                station_props.append(props)
            else:
                dash_header = (row[0]).split(' ')
                dash_bool = True
                props_iter = 0
                props = []
                for rr in dash_header:
                    props.append(station_header[props_iter:props_iter+len(rr)+1])
                    props_iter+=len(rr)+1
                station_header = props

# now we have two variables which will help us characterize each
# ground station:
# - station_header - 
#    __this variable contains the header for each station and what each row means
# - station_props -
#    __this variable contains the specific station information (for 900+ stations in USA)


########################################
# Finding and interpreting real station data
########################################
#
# if we use a geographic coordinate, we can find
# a specific station that's nearest to that location

my_lat,my_lon = 40.7128, -74.0060 # NYC coordinates

station_lats = [float(ii[9]) for ii in station_props]
station_lons = [float(ii[10]) for ii in station_props]

# find the station lat/lon nearest to the input my_lat/my_lon
nearest_indx = np.argmin(np.abs(np.subtract(station_lats,my_lat))+np.abs(np.subtract(station_lons,my_lon)))
print('-----------')
print('The nearest station to {},{} is {} (ID: {}) at {},{}'.\
      format(my_lat,my_lon,(station_props[nearest_indx][4]).replace('  ',''),
             'K'+station_props[nearest_indx][3].replace(' ',''),station_lats[nearest_indx],
             station_lons[nearest_indx]))

ftp.close() # close ftp connection

If everything worked as expected, the last line should read something like:

-----------

The nearest station to 40.7128,-74.006 is NEW YORK CNTRL PK TWR (ID: KNYC) at 40.77889,-73.96917

This states that the nearest station to the center of NYC is KNYC, which is the expected answer! If we were to input another pair of coordinates, say, for Los Angeles, we would get the printout:

-----------

The nearest station to 34.0522,-118.2437 is LOS ANGELES DWTN USC CAMPUS (ID: KCQT) at 34.0236,-118.2911

which is also as expected, since the University of Southern California campus station is likely the nearest to the central coordinates of Los Angeles.
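
One caveat: the search above minimizes the sum of absolute latitude/longitude differences in degrees, which is only a quick approximation of distance. A more faithful nearest-station search could swap in the haversine great-circle distance (a sketch, not part of the original script):

import numpy as np

def haversine_km(lat1,lon1,lat2,lon2):
    # great-circle distance between two points on Earth, in kilometers
    lat1,lon1,lat2,lon2 = map(np.radians,[lat1,lon1,lat2,lon2])
    a = np.sin((lat2-lat1)/2.0)**2 + \
        np.cos(lat1)*np.cos(lat2)*np.sin((lon2-lon1)/2.0)**2
    return 6371.0*2.0*np.arcsin(np.sqrt(a))

dists = haversine_km(np.array(station_lats),np.array(station_lons),my_lat,my_lon)
nearest_indx = int(np.argmin(dists)) # replaces the argmin line above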

We can plot the station latitudes and longitudes from the entire station database and get an idea of just how many stations there are in the continental U.S. alone:

[Map: spatial distribution of ASOS stations across the continental U.S.]
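
A basic scatter plot gets us most of the way there (a minimal sketch, assuming the ‘station_lats’ and ‘station_lons’ lists from the script above):

import matplotlib.pyplot as plt

fig,ax = plt.subplots(figsize=(10,6))
ax.scatter(station_lons,station_lats,s=8,alpha=0.6) # one point per station
ax.set_xlabel('Longitude [degrees]')
ax.set_ylabel('Latitude [degrees]')
ax.set_title('ASOS Station Locations')
plt.show()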

Now that we’ve identified stations based on coordinates, we can identify the station IDs and use those to extract data from the asos-onemin data folder!


ASOS Data Files and Identifiers

Now that we have a way of identifying the stations and their properties, we can look at how the data files are named in the asos-onemin folder. By looking at the very first file in the ‘data_files’ variable produced by the script above, we can see the format of the station file names:

6405 0 K1J0 2019 01 .dat

This format tells us a few things (a quick slicing sketch follows the list):

  1. 6405 is the data product type

  2. 0 is a placeholder

  3. K1J0 is the station identifier

  4. 2019 is the year

  5. 01 is the month

  6. .dat is the data file type
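
On disk these pieces are concatenated, so each field can be recovered by slicing (a minimal sketch; the widths come from the format above):

fname = '64050K1J0201901.dat' # example file name, fields concatenated
product = fname[0:4]   # '6405' - data product type
station = fname[5:9]   # 'K1J0' - station identifier
year    = fname[9:13]  # '2019'
month   = fname[13:15] # '01'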

We can use the file names to search for our station of interest. The station identifier can be found using our code above, which pulls it from the asos-stations.txt file. Using geographic coordinates of interest, we can find our nearest station, use its identifier to find the station data, and then download the files relevant to our coordinates. This is done in the script below.

from ftplib import FTP
import csv,os,datetime
import numpy as np
import matplotlib.pyplot as plt

plt.style.use('ggplot')

########################################
# FTP connection and directory handling
########################################
#
ftp =  FTP('ftp.ncdc.noaa.gov') # ftp access to ncdc.noaa.gov
ftp.login() # anonymous ftp login
ftp.cwd('pub/data/asos-onemin') # change directory to asos-onemin
dirs = ftp.nlst() # list all files in directory
# uncomment below to print out all files/folders in asos-onemin dir
##[print(str(ii)+' '+jj) for ii,jj in enumerate(dirs)] # print dirs/files and their indices
description_files = [ii for ii in dirs if len(ii.split('.'))>1]
for file in description_files:
    if os.path.isfile(file):
        print('already downloaded file: '+file)
        continue
    with open(file, 'wb') as fp:
        ftp.retrbinary('RETR '+file, fp.write) # save non-directory files (readme, etc.)

########################################
# Accessing data files/directories
########################################
#
data_dir_indx = 45 # select index printed out in Python window
ftp.cwd(dirs[data_dir_indx]) # 45 == 6406-2019 [in my particular case]
print('Accessing: {} Folder'.format(dirs[data_dir_indx]))
data_files = ftp.nlst() # list all data files for directory above

########################################
# download ASOS file with all station
# properties (lat/lon/elevation)
########################################
#
ftp.cwd('/pub/data/ASOS_Station_Photos/')
asos_filename = 'asos-stations.txt'
if os.path.isfile(asos_filename):
    print('already downloaded file: '+asos_filename)
else:
    with open(asos_filename, 'wb') as fp:
        ftp.retrbinary('RETR '+asos_filename, fp.write) # save non-directory files (readme, etc.)

station_props,station_header,dash_header = [],[],[]
header_bool,dash_bool = False,False
with open(asos_filename,newline='') as txtfile:
    csvreader = csv.reader(txtfile,delimiter='\t')
    for row in csvreader:
        if len(row)<1:
            continue
        if row[0][0:6]=='NCDCID':
            station_header = row[0]
            header_bool = True
        elif header_bool:
            if dash_bool:
                props = []
                props_iter = 0
                for qq in dash_header:
                    props.append(row[0][props_iter:props_iter+len(qq)+1])
                    props_iter+=len(qq)+1
                station_props.append(props)
            else:
                dash_header = (row[0]).split(' ')
                dash_bool = True
                props_iter = 0
                props = []
                for rr in dash_header:
                    props.append(station_header[props_iter:props_iter+len(rr)+1])
                    props_iter+=len(rr)+1
                station_header = props

# now we have two variables which will help us characterize each
# ground station:
# - station_header - 
#    __this variable contains the header for each station and what each row means
# - station_props -
#    __this variable contains the specific station information (for 900+ stations in USA)


########################################
# Finding a station by coordinates
########################################
#
# if we use a geographic coordinate, we can find
# a specific station that's nearest to that location

##my_lat,my_lon = 34.0522,-118.2437 # LA coordinates
my_lat,my_lon = 40.7128,-74.0060  # NYC coordinates
##my_lat,my_lon = 41.8781,-87.6298  # Chicago coordinates
##my_lat,my_lon = 21.3069,-157.8583 # Honolulu coordinates

station_lats = [float(ii[9]) for ii in station_props]
station_lons = [float(ii[10]) for ii in station_props]

# find the station lat/lon nearest to the input my_lat/my_lon
nearest_indx = np.argmin(np.abs(np.subtract(station_lats,my_lat))+np.abs(np.subtract(station_lons,my_lon)))
print('-----------')
print('The nearest station to {},{} is {} (ID: {}) at {},{}'.\
      format(my_lat,my_lon,(station_props[nearest_indx][4]).replace('  ',''),
             'K'+station_props[nearest_indx][3].replace(' ',''),station_lats[nearest_indx],
             station_lons[nearest_indx]))

########################################
# finding station data and saving it locally
########################################
#
ftp.cwd('../../data/asos-onemin/'+dirs[data_dir_indx]) # change directory back to asos-onemin
data_folder = './data/' # folder where data files will be saved (created if absent)
if not os.path.isdir(data_folder):
    os.mkdir(data_folder)

sel_data_files = []
for ss in data_files:
    if ss[6:9]==station_props[nearest_indx][3].replace(' ','') and ss.endswith('.dat'):
        sel_data_files.append(ss)

for file_ii in sel_data_files:
    if os.path.isfile(data_folder+file_ii):
        continue
    with open(data_folder+file_ii, 'wb') as fp:
        ftp.retrbinary('RETR '+file_ii, fp.write) # save data files

ftp.close() # close ftp connection

After running the script above, the data for the nearest station will be downloaded into a folder in the local directory called ‘./data/’, which is where we will subsequently read and parse real data from.
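
A quick way to confirm the downloads (a minimal sketch):

import os
print(sorted(os.listdir('./data/'))) # one .dat file per month for the selected station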


Visualizing and Parsing ASOS Data

Now that we have the data files stored locally, we can begin to parse and visualize the data. We can open the .dat files using Python’s csv reader, treat each row as a fixed-width record, and separate the data by slicing each row at fixed column widths.
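
For a single 6406 record, the idea looks like this (a minimal sketch; the column boundaries are the same ones used in the full script below and are documented in the description files downloaded earlier):

def parse_6406(row):
    # slice one raw line from a 6406 .dat file at fixed column widths
    return [row[0:10],row[10:32],row[32:44],row[44:62],row[62:70],
            row[70:76],row[76:86],row[86:95],row[95:99],row[99:]]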

This is done in the code below. The additional lines are a continuation of the code above, so look for the added lines as guidance for how the parsing takes place.

from ftplib import FTP
import csv,os,datetime
import numpy as np
import matplotlib.pyplot as plt

plt.style.use('ggplot')

########################################
# FTP connection and directory handling
########################################
#
ftp =  FTP('ftp.ncdc.noaa.gov') # ftp access to ncdc.noaa.gov
ftp.login() # anonymous ftp login
ftp.cwd('pub/data/asos-onemin') # change directory to asos-onemin
dirs = ftp.nlst() # list all files in directory
# uncomment below to print out all files/folders in asos-onemin dir
##[print(str(ii)+' '+jj) for ii,jj in enumerate(dirs)] # print dirs/files and their indices
description_files = [ii for ii in dirs if len(ii.split('.'))>1]
for file in description_files:
    if os.path.isfile(file):
        print('already downloaded file: '+file)
        continue
    with open(file, 'wb') as fp:
        ftp.retrbinary('RETR '+file, fp.write) # save non-directory files (readme, etc.)

########################################
# Accessing data files/directories
########################################
#
data_dir_indx = 45 # select index printed out in Python window
ftp.cwd(dirs[data_dir_indx]) # 45 == 6406-2019 [in my particular case]
print('Accessing: {} Folder'.format(dirs[data_dir_indx]))
data_files = ftp.nlst() # list all data files for directory above

########################################
# download ASOS file with all station
# properties (lat/lon/elevation)
########################################
#
ftp.cwd('/pub/data/ASOS_Station_Photos/')
asos_filename = 'asos-stations.txt'
if os.path.isfile(asos_filename):
    print('already downloaded file: '+asos_filename)
else:
    with open(asos_filename, 'wb') as fp:
        ftp.retrbinary('RETR '+asos_filename, fp.write) # save non-directory files (readme, etc.)

station_props,station_header,dash_header = [],[],[]
header_bool,dash_bool = False,False
with open(asos_filename,newline='') as txtfile:
    csvreader = csv.reader(txtfile,delimiter='\t')
    for row in csvreader:
        if len(row)<1:
            continue
        if row[0][0:6]=='NCDCID':
            station_header = row[0]
            header_bool = True
        elif header_bool:
            if dash_bool:
                props = []
                props_iter = 0
                for qq in dash_header:
                    props.append(row[0][props_iter:props_iter+len(qq)+1])
                    props_iter+=len(qq)+1
                station_props.append(props)
            else:
                dash_header = (row[0]).split(' ')
                dash_bool = True
                props_iter = 0
                props = []
                for rr in dash_header:
                    props.append(station_header[props_iter:props_iter+len(rr)+1])
                    props_iter+=len(rr)+1
                station_header = props

# now we have two variables which will help us characterize each
# ground station:
# - station_header - 
#    __this variable contains the header for each station and what each row means
# - station_props -
#    __this variable contains the specific station information (for 900+ stations in USA)


########################################
# Finding a station by coordinates
########################################
#
# if we use a geographic coordinate, we can find
# a specific station that's nearest to that location

my_lat,my_lon = 34.0522,-118.2437 # LA coordinates
##my_lat,my_lon = 40.7128,-74.0060  # NYC coordinates
##my_lat,my_lon = 41.8781,-87.6298  # Chicago coordinates
##my_lat,my_lon = 21.3069,-157.8583 # Honolulu coordinates

station_lats = [float(ii[9]) for ii in station_props]
station_lons = [float(ii[10]) for ii in station_props]

# find the station lat/lon nearest to the input my_lat/my_lon
nearest_indx = np.argmin(np.abs(np.subtract(station_lats,my_lat))+np.abs(np.subtract(station_lons,my_lon)))
print('-----------')
print('The nearest station to {},{} is {} (ID: {}) at {},{}'.\
      format(my_lat,my_lon,(station_props[nearest_indx][4]).replace('  ',''),
             'K'+station_props[nearest_indx][3].replace(' ',''),station_lats[nearest_indx],
             station_lons[nearest_indx]))

########################################
# finding station data and saving it locally
########################################
#
ftp.cwd('../../data/asos-onemin/'+dirs[data_dir_indx]) # change directory back to asos-onemin
data_folder = './data/' # folder where data files will be saved (created if absent)
if not os.path.isdir(data_folder):
    os.mkdir(data_folder)

sel_data_files = []
for ss in data_files:
    if ss[6:9]==station_props[nearest_indx][3].replace(' ','') and ss.endswith('.dat'):
        sel_data_files.append(ss)

for file_ii in sel_data_files:
    if os.path.isfile(data_folder+file_ii):
        continue
    with open(data_folder+file_ii, 'wb') as fp:
        ftp.retrbinary('RETR '+file_ii, fp.write) # save data files

########################################
# parsing station data and visualizing it
########################################
#
file_indx = 8
file_ii = np.sort(sel_data_files)[file_indx]
data_ii = []
with open(data_folder+file_ii, newline='') as dat_file:
    csvreader = csv.reader(dat_file)
    for row in csvreader:
        row = row[0]
        if dirs[data_dir_indx].split('-')[0]=='6405':
            data_ii.append([row[0:10],row[10:33],row[33:59],row[59:70],row[70:75],
                        row[75:80],row[80:85],row[85:90],row[90:]])
        elif dirs[data_dir_indx].split('-')[0]=='6406':
            data_ii.append([row[0:10],row[10:32],row[32:44],row[44:62],row[62:70],
                            row[70:76],row[76:86],row[86:95],row[95:99],row[99:]])

####
# the data is given in data_ii as follows (for 6405):
# field 0 - identifier for station
# field 1 - identifier + timestamp
# field 2 - visibility (N = night; D = day)
# field 3 - visibility (N = night; D = day)
# field 4 - wind direction (2-min avg)
# field 5 - wind speed     (2-min avg)
# field 6 - direction of max wind speed (5-sec)
# field 7 - speed of max wind (5-sec)
# field 8 - runway visual range (hundreds of ft)


####
# the data is given in data_ii as follows (for 6406):
# field 0 - identifier for station
# field 1 - identifier + timestamp
# field 2 - precipitation type (NP = none, S = snow, R = rain)
# field 3 - amount of precip
# field 4 - frozen precip sensor frequency
# field 5 - pressure 1
# field 6 - pressure 2
# field 7 - pressure 3
# field 8 - avg 1-min dry bulb temp
# field 9 - avg 1-min dew point temp

########################################
# PLOTTING THE DATA
########################################
#
# Let's look at one of the pressures for a given station:
t_strs,pres,temp_dry,temp_wet = np.array([]),np.array([]),np.array([]),np.array([])
for dats in data_ii:
    try:
        t_ii = (dats[1][3:].replace(' ',''))
        pres_ii = float(dats[5])      
        temp_dry_ii = float(dats[8])
        temp_wet_ii = float(dats[9])
        t_strs = np.append(t_strs,t_ii)
        pres = np.append(pres,pres_ii)
        temp_dry = np.append(temp_dry,temp_dry_ii)
        temp_wet = np.append(temp_wet,temp_wet_ii)
    except (ValueError, IndexError): # skip rows with missing/malformed fields
        pass

t_vec = [datetime.datetime.strptime(ii[0:-4],'%Y%m%d%H%M') for ii in t_strs]

# relative humidity calculation:
# equation taken from:
# https://maxwellsci.com/print/rjaset/v6-2984-2987.pdf
#
delta_T = np.subtract(temp_dry,temp_wet) # dry - wet temps
A = 0.00066*(1.0+(0.00115*temp_wet)) # empirical relationship
P = pres*33.8639 # pressure from inHg to mb
e_w = 6.112*np.exp((17.502*temp_wet)/(240.97+temp_wet)) # sat. vapor pressure for wet-bulb temp
e_d = 6.112*np.exp((17.502*temp_dry)/(240.97+temp_dry)) # sat. vapor pressure for dry-bulb temp 
RH = ((e_w-(A*P*delta_T))/e_d)*100.0


fig,axs = plt.subplots(3,1,figsize=(10,6))

cmap = plt.cm.Set1

ax = axs[0]
ax.plot(t_vec,pres,color=cmap(0))
ax2 = axs[1]
ax2.plot(t_vec,temp_dry,color=cmap(1),label='Dry-Bulb')
ax2.plot(t_vec,temp_wet,color=cmap(2),label='Wet-Bulb')

ax3 = axs[2]
ax3.plot(t_vec,RH,color=cmap(3))

ax3.set_xlabel('Local Time',fontsize=12)
ax.set_ylabel('Pressure [inHg]',fontsize=12)
ax2.set_ylabel('Temperature [F]',fontsize=12)
ax3.set_ylabel('Humidity [%]',fontsize=12)

ax.get_yaxis().set_label_coords(-0.07,0.5)
ax2.get_yaxis().set_label_coords(-0.07,0.5)
ax3.get_yaxis().set_label_coords(-0.07,0.5)

ax.set_xticks([])
ax2.set_xticks([])
ax3.tick_params(axis='x', rotation=15)

ax2.legend()
ax.set_title('Station: {}, ({},{}) [{}]'.format(station_props[nearest_indx][3].replace(' ',''),
                                             station_props[nearest_indx][9].replace(' ',''),
                                             station_props[nearest_indx][10].replace(' ',''),
                                             station_props[nearest_indx][4].replace('  ','')))
plt.savefig(station_props[nearest_indx][3].replace(' ','')+'_test_plot.png',dpi=300,facecolor=[252.0/255.0,252.0/255.0,252.0/255.0])
plt.show()

ftp.close() # close ftp connection

The code above is a lot to take in. It handles all of the aforementioned processes, as well as some simple data parsing and handling. Perhaps the most interesting part is the calculation of relative humidity, which uses a few lines of pressure and temperature calculations to arrive at an approximate value. The resulting plot should look similar to the one below, depending on the data product and station selected.
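
Written out, the relationships coded above (from the reference linked in the code, with $T_d$ and $T_w$ the dry- and wet-bulb temperatures and $P$ the pressure converted to mb) are:

$$A = 0.00066\,(1 + 0.00115\,T_w)$$

$$e_w = 6.112\,\exp\!\left(\frac{17.502\,T_w}{240.97 + T_w}\right),\qquad e_d = 6.112\,\exp\!\left(\frac{17.502\,T_d}{240.97 + T_d}\right)$$

$$\mathrm{RH} = 100\cdot\frac{e_w - A\,P\,(T_d - T_w)}{e_d}$$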

[Plot: pressure, dry/wet-bulb temperature, and relative humidity time series for the NYC station]

Notice that we have one-minute-resolution data (with dropped points, of course) for an entire month. At 60 minutes × 24 hours × ~30 days, that comes to over 40,000 data points, which is a fairly large amount of data to work with from a single weather station.


Conclusion

This tutorial focused on using Python’s file transfer protocol (FTP) library to parse weather station data made openly available by the National Climatic Data Center (NCDC). The flexibility of the code presented above allows users to pull information entirely by scripting, which enables automated visualization and analysis of weather data across the country (and the world!). FTP is a powerful tool that facilitates this type of analysis and lets programmers look at large amounts of data without needing to manually scroll and parse through it all. This tutorial was meant as an introduction to the capabilities of Python’s FTP library, while also showing a real-world example of how to use FTP methods and the downloaded data in an automated fashion.
