Blog posts on Data Science, Machine Learning, Data Mining, Artificial Intelligence, Spark Machine Learning


Saturday, February 23, 2019

How to import data into a Google Colab Jupyter Notebook

Accessing data is one of the first steps in any data analysis. In this tutorial, we will look at two ways of loading data into the Google Colab environment.


  • Uploading a CSV from the local machine into Colab
  • Loading data from Google Drive into Colab (a minimal sketch of this method follows the upload walkthrough below)

Uploading a CSV from the local machine using the upload functionality

  • Import the files library from google.colab
  • Upload the file using the Choose Files button

Running the commands below allows us to upload data files into the Colab environment. After executing these Python commands, a Choose Files button appears, and we can then select files from the local directory.
    # Opens a file picker; the uploaded files are returned as a dict of {filename: bytes}
    from google.colab import files
    uploaded = files.upload()
    
    Saving DOLPHIN.csv to DOLPHIN.csv

To view the uploaded files

The command below lets us verify that the file was uploaded correctly.
    for fn in uploaded.keys():
      print('User uploaded file "{name}" with length {length} bytes'.format(name=fn, length=len(uploaded[fn])))
    
    User uploaded file "DOLPHIN.csv" with length 117269 bytes
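
Once the upload finishes, the chosen file is also saved into the Colab working directory, so it can be read into pandas either from the returned bytes or from disk. A minimal sketch, assuming the DOLPHIN.csv file from above:

    import io
    import pandas as pd

    # Option 1: read directly from the bytes returned by files.upload()
    df = pd.read_csv(io.BytesIO(uploaded['DOLPHIN.csv']))

    # Option 2: read from the working directory, where Colab also saved the upload
    df = pd.read_csv('DOLPHIN.csv')

    print(df.shape)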
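
For the second method listed at the top of this post, loading data from Google Drive, the same google.colab package provides a drive helper that mounts the drive into the notebook's filesystem. A minimal sketch, assuming a file named DOLPHIN.csv sits at the top level of My Drive (the path is only an example):

    from google.colab import drive

    # Mounting prompts for an authorization step, then exposes Drive under /content/drive
    drive.mount('/content/drive')

    import pandas as pd
    df = pd.read_csv('/content/drive/My Drive/DOLPHIN.csv')
    print(df.head())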

Saturday, September 8, 2018

Getting started with Google Colaboratory for running deep learning applications

What is Google Colab:


We all know that deep learning algorithms improve the accuracy of AI applications to a great extent. But this accuracy comes at the cost of heavy computational hardware, such as GPUs, for developing deep learning models. Many machine learning developers cannot afford a GPU, as they are very costly, and see this as a roadblock to learning and developing deep learning applications. To help AI and machine learning developers, Google has released a free cloud-based service, Google Colaboratory: a Jupyter notebook environment with free GPU processing and no strings attached. It is a ready-to-use service that requires no setup at all.

Any AI developer can use this free service to develop deep learning applications with popular libraries such as TensorFlow, PyTorch, and Keras.
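
As a quick first step after switching the notebook runtime to GPU, the short check below (a minimal sketch; both libraries come preinstalled on Colab) confirms that the free GPU is actually visible to TensorFlow and PyTorch:

    import tensorflow as tf
    import torch

    # List any GPU devices TensorFlow can see in this runtime
    print(tf.config.list_physical_devices('GPU'))

    # True if PyTorch can reach a CUDA-capable GPU
    print(torch.cuda.is_available())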