11/2/2023

Google cloud python bokeh example

Back then, during 2017-2018, data scientists were defined as mere practitioners with a limited scope, covering only the workflow related to data: ingestion, migration, storage, manipulation, analysis, and modeling. From my point of view, however, with the fast pace of technological growth and advancement, a good data scientist should evolve accordingly. While asking a data scientist to scale their machine learning models to the cloud was generally not practical several years ago, I think it is now necessary for a modern data scientist to have at least some basic knowledge of the infrastructure behind the deployment process.

To gear myself up with the essential skills of cloud computing and software deployment, I integrated one of my data visualization applications with Flask, a web development framework, and dockerized both of them. Combined with AWS (Amazon Web Services), the deployed application now runs on the cloud 24/7. In this post, I will demonstrate the whole deployment process and break down the steps one by one, including building Docker containers, scaling services using Docker Compose, and deploying them on AWS EC2 instances.

In my previous post, I introduced how to make interactive data visualizations via Bokeh server. The extra options provided in the plot, such as shading and the radio check group, are supported by a Bokeh server running behind the scenes.

To display the content returned by the Bokeh server on a webpage, we need a framework to hold the information and present it through a web browser. There are various platforms out there that provide this functionality, including Flask, Django, etc. Compared to Django, Flask is more lightweight and gives more freedom in keeping the core of the web application extensible and straightforward. Django, on the other hand, is a full-stack Python web framework that ships with many built-in modules, such as an admin interface and ORM database support. Since what I need is a simple layout to mount my Bokeh visualization, I choose Flask to build the webpage.

Essentially, Flask runs a server locally that listens for all the inbound requests and sends the webpage content back to the client. The Flask server pulls the plot from our Bokeh server automatically once a request is heard.

Both servers are packaged into Docker containers, and Docker Compose launches them together from a single docker-compose.yml. version: '2' specifies the file format version of this docker-compose.yml; different format versions have different syntax and upgrades compared to older versions. services specifies all the containers to be launched. Within each service (container), build specifies the Docker image for that specific container to build from: context sets the build directory, and dockerfile tells Docker Compose that the container should be built from a Dockerfile. ports binds a local port to a container port; in this case, for the Bokeh container, local port 5006 is mapped to port 5006 on the container. Similarly, local port 80 is exposed to the Flask container's port 5000. Once the containers are running, we can access the servers running in each container through the local ports (e.g., 5006 for the Bokeh server and 80 for the Flask server). volumes specifies the shared volumes for all containers and maps the current working directory on localhost to the /app directory. The docker-compose.yml works with the file structure illustrated in Figure 2, where the current working directory is the vggm-webapp folder.

There are numerous ways to deploy the application to the cloud, including Microsoft Azure, Google Cloud, and Amazon Web Services. Azure provides excellent services for deploying applications on Windows operating systems or those tied to Microsoft Office, SQL Server, etc. Google Cloud is a better fit for applications related to Big Data and advanced machine learning. AWS provides a wide variety of cloud computing services and lots of free-tier-eligible services. Hence I choose AWS for my small visualization application.

Specifically, I deploy the application using Amazon Elastic Compute Cloud (EC2) following these steps:

1. Launch an EC2 t2.micro instance with the Ubuntu Server 16.04 LTS (HVM), SSD Volume Type AMI (Amazon Machine Image).
2. Configure inbound rules in the security group so that port 80 is open for incoming requests.
3. Install Docker and git on the instance.
4. Close the terminal to disconnect from the instance.
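Putting the pieces of the description together, the docker-compose.yml might look like the following sketch. The service names and the Dockerfile filenames are assumptions; the version, port mappings, and volume mapping come from the text:

```yaml
# docker-compose.yml -- run from the vggm-webapp folder.
version: '2'

services:
  bokeh:
    build:
      context: .
      dockerfile: Dockerfile-bokeh   # assumed filename
    ports:
      - "5006:5006"                  # local 5006 -> container 5006
    volumes:
      - .:/app                       # share the working directory as /app

  flask:
    build:
      context: .
      dockerfile: Dockerfile-flask   # assumed filename
    ports:
      - "80:5000"                    # local 80 -> container 5000
    volumes:
      - .:/app
```

With this file in place, `docker-compose up` builds both images and starts both containers together.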
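A minimal sketch of the Flask setup described above, using Bokeh's server_document helper to embed the plot served by the Bokeh server. The app path /vggm, the inline template, and the host/port values are my assumptions for illustration, not the author's exact code:

```python
# app.py -- minimal Flask app that embeds a plot from a running Bokeh server.
# The Bokeh app path ("/vggm") and the inline template are assumptions.
from flask import Flask, render_template_string
from bokeh.embed import server_document

app = Flask(__name__)

# Minimal inline template; a real app would use a file under templates/.
TEMPLATE = """
<!DOCTYPE html>
<html>
  <head><title>VGGM web app</title></head>
  <body>{{ script|safe }}</body>
</html>
"""

@app.route("/")
def index():
    # Generate the <script> tag that loads the interactive plot
    # from the Bokeh server once the page is requested.
    script = server_document("http://localhost:5006/vggm")
    return render_template_string(TEMPLATE, script=script)

# Inside the container, the server would be started with:
#   app.run(host="0.0.0.0", port=5000)
# so that docker-compose can map local port 80 onto it.
```

Note that server_document only generates the embedding script tag; the browser contacts the Bokeh server directly when the page loads.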
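The install-and-launch steps on the EC2 instance could be scripted roughly as follows. This is a sketch of the procedure described in the post, not the author's actual commands, and the repository URL is a placeholder:

```shell
# Run on the Ubuntu 16.04 EC2 instance after connecting via SSH.

# Install Docker, Docker Compose, and git.
sudo apt-get update
sudo apt-get install -y docker.io docker-compose git

# Allow the default user to run docker without sudo (takes effect on re-login).
sudo usermod -aG docker ubuntu

# Fetch the application and start both containers in the background.
# The URL below is a placeholder, not the author's actual repository.
git clone https://example.com/vggm-webapp.git
cd vggm-webapp
docker-compose up -d
```

Because the containers run detached (`-d`), closing the terminal disconnects from the instance without stopping the application.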