
Putting a web front-end on a Google Colab notebook

Let’s say you’re a data scientist, and you’ve been asked to classify iris flowers based on their measurements (using the famous iris dataset). You’ve written some code in a Colab notebook that solves the problem; however, what you really want is to build an interactive tool, so people can classify the flowers themselves!

In this short tutorial, we are going to build an interactive tool for people to classify iris flowers by connecting a web app to a Colab notebook. The web app will collect the iris measurements from the user, send the data to our Colab notebook, where it will be classified, and then send the classification back to our web app to display to the user.

For this tutorial you will need to know basic Python and have an understanding of how to use Google Colab notebooks.

Step 1. Create your Anvil app

Creating web apps with Anvil is simple. No need to wrestle with HTML, CSS, JavaScript or PHP. We can do everything in Python.

Log in to Anvil and click ‘New Blank App’. Choose the Material Design theme.

First, give the app a name by clicking on the title at the top of the screen.

Step 2. Design your page

To classify the species of iris a flower comes from, we need to collect several measurements, so let’s design the user interface for entering that data.

We construct the UI by dragging-and-dropping components from the Toolbox. Let’s start by dropping a Card into our form – this will be a neat container for the other components. Then let’s add a Label and a TextBox into the card component:

Next we will set up the Label and TextBox components to collect the sepal length.

Select the Label we just added and, in the properties panel on the right, change the text to ‘Sepal length: ‘ and align the text to the right.

Then select the TextBox we added, change its name to sepal_length, set the placeholder text to ‘(cm)’ and align the text to the centre.

Repeat this process adding labels and text boxes for the other parameters we need: sepal width, petal length and petal width. This will capture all the information we need to classify each iris flower.

Next, let’s add a Button to run the classifier. Name it categorise_button and change the text to ‘Categorise’. Clicking this button will trigger a Python function to send the iris measurements to our Colab notebook. (We’ll set that up in a moment.)


Finally, let’s add a Label where we’ll display our results. Put it below the button, call it species_label, centre the text and untick the ‘visible’ box in the properties panel so it doesn’t appear immediately. In step 3 we will create an event handler function that makes the label visible and uses it to display data returned from our Colab notebook.

Our app should now look like this:

In the next step we will add some code to control what happens when a user pushes the Categorise button.

Step 3. Add a button click event

We want our categorise_button to do something when it’s clicked, so let’s add a click event.

With the button selected, go to the bottom of the properties panel. Then click the blue button with two arrows in it next to the click event box. This will open our code view and create a function called categorise_button_click. From now on, every time the button is clicked by a user, this function will be called.

We want to call a function in our Google Colab notebook, passing it the measurements the user has entered into our web app. When the notebook returns our answer, we’ll display it as text on the species_label:

To do this we add the following:

def categorise_button_click(self, **event_args):
    """This method is called when the button is clicked"""
    # Call the Google Colab function and pass it the iris measurements
    iris_category ='predict_iris',
                                      self.sepal_length.text,
                                      self.sepal_width.text,
                                      self.petal_length.text,
                                      self.petal_width.text)
    # If a category is returned, set our species label
    if iris_category:
        self.species_label.visible = True
        self.species_label.text = f"The species is {iris_category.capitalize()}"

Now we have a basic UI and functionality, let’s connect our app to the code in our Google Colab notebook.

Step 4. Enable the Uplink

From the Anvil editor, let’s enable the Uplink. This gives us everything we need to connect our web app to our Colab notebook. Select the blue ‘’ button in the Sidebar Menu to open the list of available services. Then add the Uplink and click ‘Enable Server Uplink’:

This will then give us an Uplink key we can use in our Google Colab notebook, to connect to this app.

Now let’s install the Uplink in our Colab environment, and connect our script using the key we just created.

Step 5. Install the Uplink Library in our Colab Environment

In the next few steps, we will be connecting a Colab notebook to the web app we have built. For simplicity, I’ve created a notebook that already handles the iris classification for us. Make a copy of the following notebook to follow along:

In the example Google Colab notebook, I’ve written code that builds and trains a very simple classification model using scikit-learn’s built-in iris dataset and the k-nearest neighbors algorithm. How this works is beyond the scope of this tutorial, but Towards Data Science has a useful article if you’re looking for more information.
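While the scikit-learn model itself is beyond this tutorial's scope, the core idea behind k-nearest neighbors is easy to sketch in plain Python. The toy function and measurements below are illustrative only (the notebook uses scikit-learn's KNeighborsClassifier, not this code):

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Sort training points by squared Euclidean distance to x
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, x)), label)
        for row, label in zip(train_X, train_y)
    )
    # Majority vote among the k closest labels
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: [sepal length, sepal width, petal length, petal width]
train_X = [[5.1, 3.5, 1.4, 0.2], [4.9, 3.0, 1.4, 0.2],
           [7.0, 3.2, 4.7, 1.4], [6.4, 3.2, 4.5, 1.5]]
train_y = ["setosa", "setosa", "versicolor", "versicolor"]

print(knn_predict(train_X, train_y, [5.0, 3.4, 1.5, 0.2]))  # setosa
```

The real model works the same way in principle: a new flower is assigned the species most common among the training flowers with the most similar measurements.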

The first thing we need to do is install the anvil-uplink library in our Colab environment. Let’s add !pip install anvil-uplink to the top of our notebook.

The ! operator tells our notebook that this line is a command line script and not Python code.

Step 6. Connecting our Script

Now that the Uplink library will be installed when we start our notebook, we can connect our notebook in the same way as any other Uplink script.

Start by importing the anvil.server module:

import anvil.server

Then connect to the Uplink:

anvil.server.connect("your-uplink-key")

Replace “your-uplink-key” with the Uplink key from your app.

That’s it! When we run our notebook, it will now connect to our web app via the Uplink. Next, let’s create a function we can call from our Anvil app.

Step 7. Creating a callable function

With a classification model built and trained, we can create a function that takes our iris data and returns the name of the iris species. Let’s create a predict_iris function and add @anvil.server.callable so it is available to call from our app.

@anvil.server.callable
def predict_iris(sepal_length, sepal_width, petal_length, petal_width):
    classification = knn.predict([[sepal_length, sepal_width, petal_length, petal_width]])
    return iris.target_names[classification][0]

Finally, at the end of our notebook, we call anvil.server.wait_forever(). This keeps our notebook running and allows our app to call its functions indefinitely.

Run the notebook. You should see output like this:

Connecting to wss://
Anvil websocket open
Authenticated OK

Step 8. Publishing your app

Now we have our app and script connected, all we have to do is publish our app for our colleagues to use.

Click the ‘Publish’ button at the top right of the editor, then select ‘Publish this app’ and use the public URL provided or enter your own.

That’s it! Our notebook is now connected to our Anvil app, and anyone with access to the web app can interact with the code in the Google Colab notebook.

Our app has one final problem: Colab notebooks shut down after a certain amount of time. This means our model won’t be available 24/7 and our app will stop working. To solve this, we need to export the model from our Colab notebook and deploy it somewhere else.

Step 9. Deploying your model

There are two ways to deploy our model and keep it available for our Anvil app to use.

Firstly, we can host the model on our own computer and connect our app to the model using the Anvil Uplink. Here’s a full tutorial which shows you exactly how to do this for free:

The second solution is to deploy our model online with Anvil. This is the simplest way to deploy both our app and model, although hosting the model online requires a paid account. Let me quickly show you how it works.

Downloading your model

We’ll start by going back into our Colab notebook. At the end of the cell that builds and trains the iris classification model, we’ll import the joblib library and the files module from google.colab.

knn = KNeighborsClassifier(n_neighbors=10), y)

import joblib
from google.colab import files

model = joblib.dump(knn, 'knn.skmodel')[0]'knn.skmodel')

Uploading the model to our app

Now, back in the Anvil editor, let’s add the Data Files service. Select the blue ‘’ button in the Sidebar Menu and add Data Files.

Next, we can upload our model as a file by clicking the ‘Upload’ button and selecting the model we downloaded earlier.

Here’s a Gif of the full process for uploading your model:

Configuring your server environment

With our model uploaded, we need to configure our app’s server environment to include all the packages we need to use the model.

We’ll start by selecting the settings icon in the Sidebar Menu and opening ‘Python versions’.

Then, in the Python version dropdown, select ‘Python 3.10’. Under ‘Base packages’, choose the ‘Machine Learning’ base image. This includes all of the packages we’ll need to run the model.

With our server environment configured, it’s time to start using our model.

Using your hosted model

Create a Server Module by selecting the App Browser in the Sidebar Menu and clicking ‘Add Server Module’.

At the top of our server module, let’s import the joblib library we need to load our model and import load_iris from sklearn’s built-in iris dataset.

import joblib
from sklearn.datasets import load_iris

iris = load_iris()

Just as we did in our Colab notebook, let’s define the predict_iris function that takes the flower measurements.

@anvil.server.callable
def predict_iris(sepal_length, sepal_width, petal_length, petal_width):

Inside the predict_iris function, we’ll reconstruct our model using joblib.load. We will get the path to the model file on disk using data_files['knn.skmodel']. Lastly, we’ll get the classification using the same code we used in our Colab notebook.

@anvil.server.callable
def predict_iris(sepal_length, sepal_width, petal_length, petal_width):
    # Reconstruct our model
    model = joblib.load(data_files['knn.skmodel'])
    # Get the classification of the iris
    classification = model.predict([[sepal_length, sepal_width, petal_length, petal_width]])
    return iris.target_names[classification][0]

And that’s it! If we go to our app’s URL now and enter some iris measurements, our app will use our Machine Learning model that’s deployed online with Anvil.

I’ve added some images to improve the final app. To do this I simply added an image component to the app and set its source based on the returned iris classification.

Clone the App

For those of you who want to see the source code for this app:

New to Anvil?

If you’re new here, welcome! Anvil is a platform for building full-stack web apps with nothing but Python. No need to wrestle with JS, HTML, CSS, Python, SQL and all their frameworks – just build it all in Python.

Yes – Python that runs in the browser. Python that runs on the server. Python that builds your UI. A drag-and-drop UI editor. We even have a built-in Python database, in case you don’t have your own.

Why not have a play with the app builder? It’s free! Click here to get started:

Related content

  • Anvil Tutorials Get started
  • Developer Docs Start reading
  • Anvil Learning Centre Learn more

What is Google Colab?

Google Colaboratory, or Colab as most people call it, is a Cloud-based Jupyter notebook environment. It runs in your web browser (you can even run it on your favorite Chromebook) and lets anyone with internet access experiment with machine learning and coding for artificial intelligence. You can write and execute Python code, share your code and edit it simultaneously with other team members, and document everything by combining it into a single notebook with rich text, charts, images, HTML, and LaTeX.

Artificial intelligence and machine learning: A quick primer

You’ve heard about artificial intelligence (AI) and have probably heard the term machine learning (ML). While AI and ML are often used interchangeably, ML is a subset or subcategory of Artificial Intelligence. Machine learning is one of the tools or pathways to artificial intelligence, using algorithms to learn insights and recognize patterns from data.

A simple explanation of AI is computer hardware that mimics the capabilities of our own computing hardware, the human brain. By using tools like ML, artificial intelligence gains the ability to learn and make decisions without being explicitly programmed on how to make those decisions or being given all the potential outcomes. Essentially, ML takes the approach of letting a computer learn to program itself through its own experience.

If a company currently deploys AI programs, they use machine learning. ML starts with data — huge amounts of data. The controversial subject of AI-generated art is a good example, as it uses data sampling made up of other people’s artwork to train the model. Even with all that data, artificial intelligence still can’t paint like a human.

If you are a traditional programmer, you know that programming is like writing cooking recipes for a meal. When programming traditionally, you create detailed instructions telling the computer exactly what to do. The computer follows those instructions. If your code is good, it bakes the same cake you made and wrote the recipe for.

Sometimes writing code for a computer to follow isn’t possible or would be so time-consuming that the resources aren’t available to do it. There are some tasks that humans can do easily but are difficult to program computers to do, like recognizing people’s faces, knowing how to make a piece of art look like Van Gogh painted it, or telling the difference between donuts and bagels. Artificial intelligence is mostly capable of doing these things thanks to machine learning.

That’s artificial intelligence and machine learning in a nutshell. Machine learning lets AI attempt to figure things out by giving it tons of data to learn from. This takes equally huge amounts of computing power to run tests or practice the most basic code. That’s where Google Colab comes in.

Why use Google Colab?

Google has been aggressive in the field of AI research. Being a company with enormous resources, it can continually experiment and make breakthroughs in the field of Quantum AI. So, it also has a vested interest in the future of these technologies. Google’s AI framework, called TensorFlow, was made open source in 2015. This was followed by making Google’s development tool, Colaboratory, free for public use in 2017.

You heard that correctly. You have access to these things right now. Making TensorFlow and Google Colab available to the public has made education about and the development of machine learning applications easier. Even if you can’t afford the costly computational infrastructure, you can write and execute code today.

The Google Colab workspace app is installed through the Google Workspace Marketplace and integrates with Google Drive. All of your work is stored in Drive or can be loaded from your GitHub. Everything can be shared using the share settings in Google Drive, Docs, and Sheets. Your code is executed in a virtual machine that is private to your account.

Python and Jupyter can have intensive CPU and GPU workload requirements. Colab gives you free access to computing infrastructure to test and execute your code. Like many of Google’s products, there is a free tier and paid options. The free version of Colab is for students, hobbyists, and small experimental projects. As a data scientist or AI researcher, Google’s paid plans offer more compute units, faster GPUs, access to higher memory machines, and terminal access with the connected virtual machine.

If you want to learn about artificial intelligence and machine learning or have some simple Python code you want to document or experiment with, Colab requires no setup and is free to use. Google also offers various Colab Pro subscriptions that give paid users access to faster NVIDIA GPUs and compute credits for more advanced tasks.

What can you do in Google Colab?

As a programmer, these are some of the things that are possible in Colab:

  • Write, execute, and share code in Python
  • Participate in real-time collaborative coding with your team
  • Connect with GitHub to import or publish notebooks
  • Import external datasets
  • Document code that supports mathematical equations
  • Access GPU and TPU runtimes for free
  • Use preinstalled libraries like TensorFlow, Matplotlib, PyTorch, and other ML libraries
  • Integrate with GitHub
  • Use version history similar to Google Docs
  • Train models using images, audio, and text
  • Analyze and visualize data

Google Colab vs. Jupyter Notebook

Google Colab is built on Jupyter Notebook, a fully open source product that’s also available for free. Jupyter came first, and IPYNB format notebooks are typically used for data exploration, machine learning experimentation and modeling, documenting code examples, and creating tutorials. Essentially the same things you would do in Google Colab.

So if Google Colab is a way to work with Jupyter Notebooks, what is the difference between using them traditionally or in Google Colab? These are the key differences between the two:

  • Collaboration tools: The most apparent difference comes down to why Google Colab was named Google Colab. Google’s platform provides several tools to make team collaboration easier. Besides document sharing and Cloud storage, the most important is real-time collaborative coding with other team members.
  • Software: Using Jupyter Notebook traditionally requires software installations on your local hardware. You also need to install your own libraries. Colab works 100% in your web browser, so that is the only software you need, and it’s software you already have.
  • Document sharing: Colab notebooks are stored and shared using Google Drive. Like Google Docs and Sheets, your notebooks automatically save periodically, have version history, and can be shared using the same sharing permissions. You can also share your Colab files with anyone without the other person needing to install software to see it.
  • Computing power: Traditional Jupyter Notebooks are stored locally, and code is executed using your local machine’s hardware. Even if you have a blazing-fast home computer, it’s limited in comparison to the computing power Google Colab gives you access to.

Up your ML game with Google Colab

Artificial intelligence is changing every industry it touches. If you are an aspiring data scientist, a researcher, or are interested in learning about AI, Google Colab lets you play with their toys. In the immediate future, you’ll continue seeing the powerful and persuasive subfield of Artificial Intelligence, called machine learning, spread everywhere. Thanks to Google Colab and the Jupyter open source code, you can learn to use these tools from home.

There are some prerequisites before you can code in these environments. You’ll need to know the basics of Python and be familiar with GitHub. Colab is a great place to start learning those things as well. There are many guides and tutorials for the different things you can do in Colab, so you can start your learning journey for free.

Benjamin is a business consultant, coach, designer, musician, artist, and writer, living in the remote mountains of Vermont. He has 20 years experience in tech, an educational background in the arts, and a penchant for taking things apart to find out how they work. He has a keen eye for well-designed tech products and an understanding of how they fit into our modern lives.

Comparison: Google’s AI Platform Notebooks and Paperspace’s Gradient Notebooks

David. 21 Apr 2021

Google Cloud Platform offers a set of machine learning tools called AI Platform, which includes tools for labeling data, creating pipelines (via Kubeflow), running jobs, deploying models, and creating shareable Cloud-based notebooks.

Today we’ll be looking at AI Platform Notebooks – a product that competes directly with enterprise notebooks from other public clouds such as Azure’s Machine Learning Notebooks and AWS’s SageMaker notebooks – and we’ll be comparing it to Paperspace Gradient, a product that competes on both usability and power.

Google’s lay of the land in ML notebooks

Google owns and operates a number of products and companies in the machine learning space. To be clear, the target of our comparison today is Google Cloud Platform’s AI Notebooks product – not Kaggle Kernels or Google Colab – although we will dive deeper into those products at a later date.

GCP AI Notebooks (today’s comparison) are geared toward enterprise clients who need a full JupyterLab instance hosted in the Cloud (on GCP) with enterprise features like role-based access control and compliance guarantees.

Google Colab, meanwhile, is a light version of JupyterLab commonly used as a scratchpad for ML engineers doing exploratory work and for sharing the latest libraries and tools with collaborators and the public.

Kaggle Kernels, meanwhile, is the Kaggle community’s data-science-centric version of a light JupyterLab-style IDE that also supports R.

Although Colab and Kaggle Kernels have their tradeoffs, AI Notebooks stands alone as the only full version of JupyterLab that Google offers in the Cloud. Some of the other differences among Google notebook products are as follows:

How does Paperspace Gradient compare?

Paperspace is a young company compared to Google but boasts nearly 500,000 Cloud GPU users across three data center regions. Paperspace Gradient notebooks, which were introduced in early 2018, are already among the most popular Cloud notebooks, with the product officially recommended by as a Cloud notebook provider.

Paperspace Gradient notebooks offer some of the professional appeal of Google AI Platform notebooks (like powerful GPU instances, team collaboration, and building from your own container) but with many of the usability features that Kaggle Kernels and Google Colab users enjoy – like being able to startup a notebook in a few seconds and invite a collaborator with the press of a button.


Google AI Platform Notebooks are enterprise-grade notebooks best suited for those with compliance requirements, those with a need to ingest data from GCP sources like BigQuery, and those who are already in the GCP ecosystem and can take advantage of existing compute instances.

On the downside, AI Platform Notebooks require a lot of setup time, require GCP instances to run notebooks, and have some confusing interface quirks that make it difficult to get up and running quickly – or even to accomplish some basic tasks.

In-depth look at Cloud AI Notebooks on GCP

Google’s effort to provide a full lifecycle of software tools for machine learning is called AI Platform.

AI Platform is billed as an end-to-end machine learning life cycle and contains the following components:

AI Platform Notebooks are, in Google’s parlance, part of the Build step. The product differs from other Google-backed notebook options such as Kaggle Notebooks or Colab in that these notebooks are backed by specific (and potentially more powerful than the P100 you get on Kaggle or the K80 from Colab) GCP instances.

As mentioned, Google has three notebook products:

Google AI Platform Notebooks is billed as more fully featured than Google Colab and Kaggle Notebooks.

AI Platform Notebooks are designed with the enterprise user in mind. In fact, Kaggle has several prompts to encourage you to upgrade to AI Platform Notebooks.

So AI Platform Notebooks are Google’s highest-end notebook offering.

Google Colab and Kaggle Notebooks have much wider use, but don’t enjoy the robustness and scalability of AI Notebooks.

For example, if you need to create notebooks that aren’t preemptible, or that can run on more than a single K80 GPU, or any number of other scenarios, you will need to use AI Platform Notebooks to meet these requirements.

Feature Comparison

In general, Gradient Notebooks and AI Platform Notebooks offer a fully featured and managed version of JupyterLab with some additional features around data ingestion, compute management, and so forth.

Let’s compare Paperspace Gradient and AI Platform Notebooks:

There are many workarounds requiring custom code. One example uses Google’s Cloud Scheduler and another requires writing a cronjob with API access to the instance; there is more info on Stack Overflow.

Cost Comparison

Google AI Platform notebooks run on GCP instances and pricing is notoriously difficult to predict.

Below is an attempt to cost out Google AI Platform Notebooks by GPU type:

Paid instances from Paperspace require a subscription plan while GCP AI Notebooks do not require a subscription. Gradient subscription tiers are as follows:

Setting up a Jupyter Notebook in Paperspace Gradient

To get started with a notebook in Gradient:

  • Create a Paperspace account (link)
  • Navigate to Gradient Notebooks and select Create Notebook
  • Enter a name for the notebook, a runtime (optional), and select an instance
  • If you’ve selected a free CPU or free GPU instance, select Start Notebook and that’s it! (Paid instances require a credit card.)
  • NOTE: Paperspace offers unlimited use of free-tier CPU and GPU-backed notebooks

Startup Time

Any Cloud provider will take a few moments to spin up a CPU or GPU instance. GCP AI Platform takes about three minutes of provisioning to create your first resource, while Paperspace takes about 30 seconds.


Google has three Cloud notebook products – Google Colab, Kaggle Notebooks, and AI Platform Notebooks – and each has different strengths ranging from the social (e.g. Kaggle) to the powerful (AI Platform Notebooks).

AI Platform Notebooks are an enterprise offering designed for use in companies that have IT functions to monitor access control and resource consumption. The product is best suited to companies that already make good use of GCP resources – since integration with GCP compute instances and GCP-based data management tools like BigQuery is key to AI Platform Notebooks’ usability.

Paperspace Gradient Notebooks meanwhile offer extensibility (via the Paperspace Cloud or another cluster from a public Cloud that’s been configured) and more features to get you up and running faster, including link sharing, easy collaborator management, public Run on Gradient links to share your work, simple billing, and so on.

We hope you enjoyed the comparison!

To read more in this comparison series please check out Comparison: Azure ML Notebooks and Gradient Notebooks or visit Paperspace Learn.

Paperspace Blog

Tutorials, sample apps, and more created by the Paperspace internal research team and community

How to Use Google Colab for Python (With Examples)

Google Colab is a free Jupyter notebook environment that lets you run Python in the browser with no complex configuration. It comes with Python and all the main Python libraries preinstalled, and it also integrates free GPUs.

In this tutorial, we will cover everything that you need to get started using Python with Google Colab.

Google Colab is truly the fastest way to start using Python on any computer.

What is Google Colab?

Google Colab is a browser-based product created by Google Research that lets you write and execute Python code without any specific configuration.

Useful Keyboard Shortcuts in Google Colab

To run Python code, add your code to a cell and press the play button at the left of the cell (or press Ctrl+Enter). This executes the selected cell in IPython.

Make Your Notebook Interesting With Markdown

One of the interesting things about Jupyter Notebooks is that they let you surround your code with relevant documentation in a digestible format.

The way to do that is by using Markdown.

To open a new Markdown cell in Google Colab, press ‘Text’ at the top of the notebook or below any cell that you hover over with your mouse, or click Insert > Text cell from the menu.

Then use the Markdown syntax to annotate your document.
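For example, a Markdown cell like the following (the content is just an illustration) renders as a formatted heading, emphasis and list:

```markdown
# Iris classifier

This notebook trains a **k-nearest neighbours** model on the iris dataset.

- Load the data
- Train the model
- Evaluate accuracy
```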

This makes your documentation visually compelling.

How to Know What Packages are Already Installed in Colab?

Google Colab comes with pre-installed Python libraries.

You can see the full list of packages already installed in Google Colab by using pip list.

Or filter the output using grep:

!pip list | grep tensorflow

How to Install Python Packages in Google Colab?

Installing a Python package in Google Colab is simple using the pip command along with the exclamation mark (!).

The exclamation mark at the start of a cell lets you run a shell command, and pip is the Python package installer used to install Python libraries.

Explore Your Colab Environment

Before we go further, let’s look at how to explore your environment in Google Colab.

On the left panel there are quick links that allow you to view:

  • Table of contents: shows the Markdown headings
  • Find and replace: Find and replace any string or regex from the entire file
  • Variable inspector: Show all variables that are stored
  • File explorer: Files and directories available from Colab. This is where you’ll view the files of a mounted drive
  • Code snippets: Pre-built reusable code snippets
  • Search commands: Search box of the commands available from the menu
  • Terminal: In the pro version you can get access to the runtime’s terminal

How to Connect Google Drive to Google Colab?

You can connect to Google Drive from Google Colab so that you can use the files already stored there or even store the results of your scripts. To use files from Google Drive in Google Colab, you first need to mount your drive.

from google.colab import drive

drive.mount('/content/drive')

An overlay will ask you to permit the notebook to access Google Drive files. You will need to click on “Connect to Google Drive” and follow the prompts to give access to your Google Drive.

Your files will be listed in the following directory: /content/drive/MyDrive.
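Once the drive is mounted, you can read and write files under that path with ordinary Python file operations. Here is a minimal sketch (the save_results helper and the file names are invented for this example; in Colab you would pass a path under /content/drive/MyDrive, while the snippet below uses a local path so it runs anywhere):

```python
import os

def save_results(path, text):
    """Write text to a file, creating the parent directory as needed."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        f.write(text)
    return path

# In Colab this might be "/content/drive/MyDrive/results/output.txt"
saved = save_results("results/output.txt", "accuracy: 0.97")
print(open(saved).read())  # accuracy: 0.97
```

Anything written under the mounted path is persisted to your Google Drive, so it survives the Colab runtime shutting down.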

How to Run Magic Commands?

You can run IPython enhancements in Google Colab by running magic commands.

Examples of things that you can do with the magic commands:

  • Show all magic commands: %lsmagic
  • Run a file: %run
  • Time the execution of a cell: %%time
  • Display Matplotlib: %matplotlib inline
  • Get and set environment variable: %env
  • Better formatting of Pandas DataFrames: %load_ext google.colab.data_table

Google Colab for Machine Learning

Get Free GPUs

Google Colab provides free GPUs to speed up the training of your machine learning models.

GPUs, or Graphics Processing Units, are useful in machine learning. They allow multiple parallel processing of calculations, which is useful when training large machine learning models.

To enable free GPUs in Colab, go to:

Runtime > Change runtime type, and select the right hardware accelerator.

GPUs are more expensive than CPUs, and Google imposes limits on their use. If you don’t need GPUs, it is probably best to set the hardware accelerator to “None”.

How to Make Drive Directory on Google Colab

To create a new Google Drive folder in the ‘Files’ section, mount Google Drive and use !mkdir in a code cell:

To create a drive folder at a specific location in your Google Drive, use the os module.

import os

if not os.path.exists('/path/to/folder'):
    os.mkdir('/path/to/folder')

Import a Module in Google Colab from GitHub URLs

To import a file from GitHub into Google Colab, use the Python requests library to fetch the raw file URL from GitHub. Then write the file to disk and import it.

import requests

r = requests.get('')
with open('', 'w') as f:
    f.write(r.text)

from country import countries
countries

Google Colab FAQs

The exclamation mark ( ! ) before a statement in a Jupyter notebook runs it as a shell command in the underlying operating system.

Google Colab is used to run Python in a browser without the need for a complex setup.


Google Colab is a free and easy-to-use Jupyter Notebook environment that provides free GPUs, a valuable resource for machine learning tasks.

Simply register for a Google account to get full access to Google Colab, where Python comes preinstalled.


We introduced Google Colab to help you get started with Python. If you would like to learn Python, make sure to check out my Python SEO tutorials.

The New Form Feature in Google Colaboratory: A Game Changer for Data Science and Machine Learning

As a software engineer, you’re likely familiar with Google Colaboratory, a free web-based platform that allows you to write and run code in a Jupyter notebook environment. Many data scientists and machine learning practitioners use Google Colab for its ease of use, powerful computing resources, and integration with popular data science libraries like TensorFlow and PyTorch.

Recently, Google Colab introduced a new feature that has the potential to revolutionize the way data scientists and machine learning practitioners interact with their data: Forms. In this blog post, we’ll explore what Forms are, how to use them, and why they’re a game changer for data science and machine learning.

What are Forms?

Forms are a new feature in Google Colab that allow you to create interactive widgets within a notebook. These widgets can be used to gather user input, display information, and even trigger code execution. Forms are built using the ipywidgets library, which provides a wide range of widgets that can be customized to suit your needs.

With Forms, you can create user interfaces that make it easy for others to interact with your code. For example, you can create a form that allows users to select a dataset, choose a machine learning model, and specify hyperparameters. Once the user has filled out the form, your notebook can automatically run the code to train the model and display the results.

How to Use Forms

Using Forms in Google Colab is simple. First, you need to enable Forms in your notebook by running the following code:

!pip install -q ipywidgets

import ipywidgets as widgets
from IPython.display import display

Once ipywidgets is installed, you can create a form by defining a function that builds one or more widgets. For example, the following code creates a form with two text input widgets:

def my_form():
    text1 = widgets.Text(description='Name:')
    text2 = widgets.Text(description='Email:')
    display(text1, text2)
    # Return the widgets so their values can be read after user input
    return text1, text2

To display the form to the user, you simply call the function:

text1, text2 = my_form()

Now, when you run the cell that calls my_form, you’ll see a form with two text input widgets. The user can enter their name and email address, and you can access these values in your code using the value attribute of each widget:

name = text1.value
email = text2.value

You can create many different types of widgets, including sliders, dropdowns, checkboxes, and more. You can also use the observe method to define a callback function that is called whenever the value of a widget changes. This allows you to create dynamic forms that update in real-time based on user input.
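Under the hood, observe follows the classic observer pattern; here is a stdlib-only sketch of the idea (FakeText below is a simplified stand-in, not the real ipywidgets API):

```python
class FakeText:
    """Minimal stand-in for an ipywidgets Text widget."""

    def __init__(self):
        self._value = ''
        self._callbacks = []

    def observe(self, callback):
        # Register a function to run whenever value changes
        self._callbacks.append(callback)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        # Record the change, then notify every registered callback
        old, self._value = self._value, new
        for cb in self._callbacks:
            cb({'old': old, 'new': new})

changes = []
text = FakeText()
text.observe(changes.append)
text.value = 'hello'
print(changes)  # → [{'old': '', 'new': 'hello'}]
```

The real ipywidgets observe works the same way, except the callback receives a richer change object and you can filter on which trait to watch.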

Why Forms are a Game Changer

Forms are a game changer for data science and machine learning because they make it easy to create user interfaces that allow others to interact with your code. This is particularly useful when working in a team or collaborating with others. With forms, you can create a user-friendly interface that abstracts away the complexity of your code, making it accessible to non-technical users.

Forms also make it easy to run experiments and iterate quickly. With a form, you can quickly try out different hyperparameters, models, and datasets without having to modify your code. This allows you to quickly explore different options and find the best solution for your problem.

Finally, forms are a great way to showcase your work. With a form, you can create an interactive demo of your project that allows others to see your results and play around with your code. This can be a powerful tool for sharing your work with others and getting feedback.


Forms are a powerful new feature in Google Colab that allow you to create interactive widgets within your notebooks. With forms, you can create user-friendly interfaces that make it easy for others to interact with your code, run experiments, and showcase your work. If you’re a data scientist or machine learning practitioner, Forms are a game changer that you should definitely check out.