
Plot multiple lines in one chart with different styles in Python matplotlib

Sometimes we need to plot multiple lines in one chart using different styles such as dots, solid lines, or dashes, perhaps with different colours too. It is quite easy to do this with basic Python plotting using the matplotlib library.

We start with the simplest case, only one line (the example data below is just for illustration):

import matplotlib.pyplot as plt

# a single line from a list of y values (illustrative data)
plt.plot([1, 2, 3, 4])

# when you want to give a label
plt.xlabel('This is X label')
plt.ylabel('This is Y label')
plt.show()


Let’s go to the next step: several lines with different colours and different styles.

import numpy as np
import matplotlib.pyplot as plt

# evenly sampled time at 200ms intervals
t = np.arange(0., 5., 0.2)

# red dashes, blue squares and green triangles
plt.plot(t, t, 'r--', t, t**2, 'bs', t, t**3, 'g^')
plt.show()

With only three lines it still seems easy, but what if there are many lines, e.g. six lines? Here is a sketch (the x values, the extra lines, and their labels below are only illustrative):

import matplotlib.pyplot as plt
import numpy as np

# illustrative data and labels; add more ax.plot calls in the same way for more lines
x = np.linspace(1, 10)
fig, ax = plt.subplots()

ax.plot(x, x + 1, c='g', marker=(8, 2, 0), ls='--', label='Greedy Heuristic')
ax.plot(x, x + 2, c='b', marker='^', ls='-', label='Greedy')
ax.plot(x, x + 3, c='r', marker='v', ls=':', label='Random')
ax.legend()
plt.show()


Now, we can plot multiple lines with multiple styles in one chart.

These are the resources from matplotlib that may be useful:

  1. Marker types of matplotlib
  2. Line styles matplotlib
  3. Matplotlib marker explanation


An experiment using a real Excel dataset, plotted as a line chart with different markers assigned automatically, without defining them one by one for each line, can be seen in this post.
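
For a rough idea of what that looks like, here is a minimal sketch; the DataFrame, column names, and marker list are my own illustration, not the dataset from that post (a real Excel sheet could be loaded with pd.read_excel instead).

import itertools
import matplotlib.pyplot as plt
import pandas as pd

# illustrative data; a real Excel file could be loaded with pd.read_excel('data.xlsx')
df = pd.DataFrame({'A': [1, 2, 3, 4], 'B': [1, 4, 9, 16], 'C': [1, 8, 27, 64]})

# cycle through markers so each column gets its own style automatically
markers = itertools.cycle(['o', 's', '^', 'v', 'D'])
for column in df.columns:
    plt.plot(df.index, df[column], marker=next(markers), label=column)

plt.legend()
plt.show()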

*Some parts of the code were taken from StackOverflow


Python for Data Science using Anaconda

I am a lazy guy: once I already have my environment set up, it is hard for me to move on. On my laptop (my lovely MacBook), I have set up Python with many virtual environments (virtualenv), and of course one of them has a complete set of data science libraries. I use this virtualenv whenever I want to do data analysis. I also have some experience using docker-machine to make my data analysis more productive and reproducible, but it is heavy on my laptop.

I had heard about Anaconda (conda), the platform that bundles all the data science libraries onto one plate so you can just enjoy them, but I had never tried it. Still, it is hard to move on. Then I got a new computer in my office, and it runs Windows. I wanted to start working with data using Python on it, and I tried to remember what steps I had to do: installing Python, installing pip, installing virtualenv, installing virtualenvwrapper, installing all the data science libraries into one of my virtualenvs, and then starting to work.

Being the lazy guy, I did not want to do all that. I went to the Anaconda website, decided to give it a try, and tarraa!

Just download the installer that matches your operating system, install it, then launch the Anaconda Navigator.


It surprised me that I can even run RStudio from the Anaconda Navigator. If you enjoy using Jupyter (IPython Notebook), you just need to press Launch for Jupyter. One beautiful IDE for data analysis in Python is Spyder. It is really cool. Previously, when I was in R, I always used RStudio as my IDE for data analysis; now, if you want to move to Python, there is Spyder.

If you want to know which data science libraries you need to install manually if you don’t want to use Anaconda, please visit this link. The picture below describes the Python environment; it is really useful for me:


Python for Data Science

I spent two years processing and manipulating data using R, and I mostly used this language for my research projects. I had only heard about Python for data analytics and had never tried it before. But now, after using Python, I have really fallen in love with this language. Python is very simple, and it is known as one of the easiest languages to learn. The reason I previously used R was that it is supported by tons of open-source libraries for scientific analysis. Now, with the popularity of Python, all the libraries I need can easily be found in Python, and all of them are open source too.

These are the core libraries that you must know when you start doing data analytics with Python (a short sketch of how they fit together follows the list):

  1. NumPy, which stands for Numerical Python. Python is different from R: R was made for scientists, while Python is a general-purpose programming language, so a library is needed that can handle numerical things such as complex arrays and matrices. Repo project link:
  2. SciPy, a library for scientific computing; it handles things such as statistical computing, linear algebra, optimization, etc. Repo project link:
  3. Pandas, which, if you have ever played with R, is very similar to a data.frame. By using a DataFrame, we can easily manipulate, aggregate, and analyze our dataset. The data is shown in a table, like in Excel or a data.frame in R, and it is convenient to access the data by columns, rows, or otherwise. Repo project link:
  4. Matplotlib. Plotting is very important for data analysis. To make the data easy for people to read, and since one picture can describe a thousand words, we absolutely need data visualization tools. If you have experience with Excel, it is very easy: just select the table you want to plot and pick a chart type such as a bar chart, line chart, etc. In R, the most popular plotting tool is ggplot; you can use the standard 'plot' function, but if you want more advanced and more beautiful figures, you need ggplot. Matplotlib is the basic library for visualizing your data, similar to what I explained above. Repo project link:
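
As a rough sketch of how these core libraries fit together (the numbers and column names here are made up just for illustration):

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy import stats

# NumPy: generate numeric arrays
x = np.linspace(0, 10, 50)
y = np.sin(x)

# SciPy: scientific computing, e.g. quick descriptive statistics
print(stats.describe(y))

# Pandas: wrap the arrays in a DataFrame, like a table in Excel or a data.frame in R
df = pd.DataFrame({'x': x, 'y': y})
print(df.head())

# Matplotlib: visualize the data (Pandas plotting uses matplotlib under the hood)
df.plot(x='x', y='y')
plt.show()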

Those are the core libraries that you need when you start using this language for data analytics. There are still many other libraries that are very useful, such as:

  1. SciKit-Learn, when you want to apply machine learning in your data analytics (a minimal sketch follows this list).
  2. Scrapy, to scrape data from the internet, when you want to gather data from websites for your analysis. I used the tweepy library to collect tweet data from Twitter.
  3. NLTK, if you want to do natural language processing.
  4. Theano, TensorFlow, Keras, when you are not satisfied with NumPy performance, want to apply neural network algorithms, or do deep learning stuff; these libraries are very useful for that.
  5. Interactive visualization tools: matplotlib is a basic plotting tool and it is enough for me as a researcher, especially for publications, but when you want dynamic or more interactive plotting you can use Seaborn, Plotly, or Bokeh.
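
As a very small taste of the first item above, here is a minimal scikit-learn sketch; the built-in iris dataset and the k-nearest-neighbours model are just my own illustrative choices, not something from this post.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# load a small built-in dataset and split it into train and test sets
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# fit a simple classifier and print its accuracy on the held-out data
model = KNeighborsClassifier()
model.fit(X_train, y_train)
print(model.score(X_test, y_test))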


If you are a lazy guy like me and do not want to install all the stuff above one by one, you can try Anaconda; it is really cool.


See ya in the next post..

Brisbane, 24 November 2017