Sticky post

What Is Federated Learning?

Standard machine learning approaches require centralizing the training data on one machine or in a datacenter. And Google has built one of the most secure and robust cloud infrastructures for processing this data to make our services better. Now for models trained from user interaction with mobile devices, we’re introducing an additional approach: Federated Learning. Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device, decoupling the ability to do machine learning from the need to store the data in the cloud. This goes beyond the use of local models that … Continue reading What Is Federated Learning?
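The core idea can be sketched in a few lines of plain Python. This is a toy illustration of federated averaging under assumed conditions (a one-parameter linear model, three simulated clients, one local gradient step per round); the names `local_step` and the data values are my own for illustration, not from any federated learning library.

```python
# Toy federated averaging: each client updates a shared model on its own
# data, and only the updated parameter (never the data) reaches the server.

def local_step(w, data, lr=0.1):
    """One gradient step on squared error for the model y = w * x."""
    n = len(data)
    grad = sum(2 * x * (w * x - y) for x, y in data) / n
    return w - lr * grad

# Three simulated phones, each with private data from slightly different
# "true" slopes (2.8, 3.0, 3.2). The raw data never leaves the client.
clients = [
    [(x, 2.8 * x) for x in (1.0, 2.0)],
    [(x, 3.0 * x) for x in (1.0, 2.0)],
    [(x, 3.2 * x) for x in (1.0, 2.0)],
]

w = 0.0  # shared global model
for _round in range(50):
    # Each client trains locally, then the server averages the results.
    updates = [local_step(w, data) for data in clients]
    w = sum(updates) / len(updates)

print(round(w, 2))  # converges near the average slope, 3.0
```

Real federated learning adds secure aggregation, client sampling, and compression on top of this averaging loop, but the decoupling shown here — data stays local, only model updates travel — is the essential point.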

Predict prices using regression with ML.NET

You can follow along or download the source code here. Create a console application Create a .NET Core Console Application called “TaxiFarePrediction”. Create a directory named Data in your project to store the data set and model files. Install the Microsoft.ML NuGet Package: In Solution Explorer, right-click the … Continue reading Predict prices using regression with ML.NET

Create an IoT hub using the Azure portal

Create an IoT hub This section describes how to create an IoT hub using the Azure portal. Sign in to the Azure portal. From the Azure homepage, select the + Create a resource button, and then enter IoT Hub in the Search the Marketplace field. Select IoT Hub from the search results, and then select Create. On the Basics tab, complete the fields as follows: Subscription: Select the subscription to use for your hub. Resource Group: Select a resource group or create a new one. To create a new one, select Create new and fill in the name you want to use. To use an existing resource group, select that resource group. For more information, … Continue reading Create an IoT hub using the Azure portal

Deploy and run an IoT device simulation in Azure

Deploy Device Simulation When you deploy Device Simulation to your Azure subscription, you must set some configuration options. Sign in to azureiotsolutions.com using your Azure account credentials. Click the Device Simulation tile: Click Try now on the Device Simulation description page: On the Create Device Simulation solution page, enter a unique Solution name. Select the Subscription and Region you want to use to deploy the solution accelerator. Typically, you choose the region closest to you. You must be a global administrator or user in the subscription. Check the box to deploy an IoT hub to use with your Device Simulation solution. You can always change the IoT hub your simulation uses later. Click Create to begin provisioning … Continue reading Deploy and run an IoT device simulation in Azure

How Azure Machine Learning works: Architecture and concepts

How Azure Machine Learning works: Architecture and concepts Workflow The machine learning model workflow generally follows this sequence: Train Develop machine learning training scripts in Python or with the visual designer. Create and configure a compute target. Submit the scripts to the configured compute target to run in that environment. During training, the scripts can read from or write to datastore. And the records of execution are saved as runs in the workspace and grouped under experiments. Package – After a satisfactory run is found, register the persisted model in the model registry. Validate – Query the experiment for logged metrics from the current and past runs. If the metrics don’t indicate a desired outcome, … Continue reading How Azure Machine Learning works: Architecture and concepts

Introduction Into Azure Machine Learning

Azure Machine Learning is a platform for operating machine learning workloads in the cloud. Built on the Microsoft Azure cloud platform, Azure Machine Learning enables you to manage: Scalable on-demand compute for machine learning workloads. Data storage and connectivity to ingest data from a wide range of sources. Machine learning workflow orchestration to automate model training, deployment, and management processes. Model registration and management, so you can track multiple versions of models and the data on which they were trained. Metrics and monitoring for training experiments, datasets, and published services. Model deployment for real-time and batch inferencing. Azure Machine Learning workspaces … Continue reading Introduction Into Azure Machine Learning

Deploy a machine learning model with the designer

Create a real-time inference pipeline To deploy your pipeline, you must first convert the training pipeline into a real-time inference pipeline. This process removes training modules and adds web service inputs and outputs to handle requests. Create a real-time inference pipeline Above the pipeline canvas, select Create inference pipeline > Real-time inference pipeline. Your pipeline should now look like this: When you select Create inference pipeline, several things happen: The trained model is stored as a Dataset module in the module palette. You can find it under My Datasets. Training modules like Train Model and Split Data are removed. The saved trained model is added back into the pipeline. Web Service Input and Web Service … Continue reading Deploy a machine learning model with the designer

Predict automobile price with the designer: A regression problem

Create a new pipeline Azure Machine Learning pipelines organize multiple machine learning and data processing steps into a single resource. Pipelines let you organize, manage, and reuse complex machine learning workflows across projects and users. To create an Azure Machine Learning pipeline, you need an Azure Machine Learning workspace. In this section, you learn how to create both these resources. Create a new workspace In order to use the designer, you first need an Azure Machine Learning workspace. The workspace is the top-level resource for Azure Machine Learning; it provides a centralized place to work with all the artifacts you … Continue reading Predict automobile price with the designer: A regression problem

What is Azure Machine Learning?

Azure Machine Learning can be used for any kind of machine learning, from classical ML to deep learning, and from supervised to unsupervised learning. Whether you prefer to write Python or R code or to use zero-code/low-code options, you can start training on your local machine and then scale out to the cloud. The service also interoperates with popular open-source tools, such as PyTorch, TensorFlow, and scikit-learn. Machine learning tools to fit each task Azure Machine Learning provides all the tools developers and data scientists need for their machine learning workflows, including: The Azure Machine Learning designer (preview): drag-n-drop modules to build your experiments and then deploy pipelines. … Continue reading What is Azure Machine Learning?

MLOps: Model management, deployment and monitoring with Azure Machine Learning

Learn how to use Azure Machine Learning to manage the lifecycle of your models. Azure Machine Learning uses a Machine Learning Operations (MLOps) approach. MLOps improves the quality and consistency of your machine learning solutions. Azure Machine Learning provides the following MLOps capabilities: Create reproducible ML pipelines. Pipelines allow you to define repeatable and reusable steps for your data preparation, training, and scoring processes. Register, package, and deploy models from anywhere and track associated metadata required to use the model. Capture the governance data required for capturing the end-to-end ML lifecycle, including who is publishing models, why changes are being made, … Continue reading MLOps: Model management, deployment and monitoring with Azure Machine Learning

Monitor SQL performance in Linux

When you run SQL Server 2017 on a Linux server, you cannot use Windows Performance Monitor to gather and display performance counters because Performance Monitor is not supported on Linux. Suppose you are a database administrator for a global novelty goods importer called Wide World Importers. You have migrated your customer-facing product database to a Linux server. Recently, some users have complained of slow performance. You want to use Azure Data Studio to display SQL Server performance counters and how they vary over time. Start by logging into the database server: Sign into the VM with a Microsoft account, or … Continue reading Monitor SQL performance in Linux

Classify Images of Clouds in the Cloud with AutoML Vision

AutoML Vision helps developers with limited ML expertise train high quality image recognition models. Once you upload images to the AutoML UI, you can train a model that will be immediately available on GCP for generating predictions via an easy to use REST API. In this lab you will upload images to Cloud Storage and use them to train a custom model to recognize different types of clouds (cumulus, cumulonimbus, etc.). Set up AutoML Vision AutoML Vision provides an interface for all the steps in training an image classification model and generating predictions on it. Start by enabling the Cloud … Continue reading Classify Images of Clouds in the Cloud with AutoML Vision

Machine Learning with Spark on Google Cloud Dataproc

In this post you will learn how to implement logistic regression using a machine learning library for Apache Spark running on a Google Cloud Dataproc cluster to develop a model for data from a multivariable dataset. Google Cloud Dataproc is a fast, easy-to-use, fully-managed cloud service for running Apache Spark and Apache Hadoop clusters in a simple, cost-efficient way. Cloud Dataproc easily integrates with other Google Cloud Platform (GCP) services, giving you a powerful and complete platform for data processing, analytics and machine learning. Apache Spark is an analytics engine for large-scale data processing. Logistic regression is available as a … Continue reading Machine Learning with Spark on Google Cloud Dataproc
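Before reaching for a cluster, it helps to see what logistic regression actually computes. Below is a toy sketch in plain Python of the gradient-descent fit that Spark's MLlib performs at scale; the 1-D dataset and learning rate are made up for illustration, and this is not the MLlib API.

```python
import math

# Toy logistic regression via gradient descent on a 1-D dataset.
# (Spark's MLlib does this across a cluster; this sketch shows the math.)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic data: label is 1 when x > 0.
data = [(-2.0, 0), (-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1), (2.0, 1)]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(500):
    # Gradient of the negative log-likelihood over the whole dataset.
    gw = sum((sigmoid(w * x + b) - y) * x for x, y in data) / len(data)
    gb = sum((sigmoid(w * x + b) - y) for x, y in data) / len(data)
    w -= lr * gw
    b -= lr * gb

predictions = [1 if sigmoid(w * x + b) >= 0.5 else 0 for x, _ in data]
print(predictions)  # matches the labels: [0, 0, 0, 1, 1, 1]
```

MLlib parallelizes exactly this kind of gradient computation: each partition of the dataset computes a partial gradient sum, and the driver combines them per iteration.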

Azure Platform for Data Engineers(part-2)

Explore data types: Azure provides many data platform technologies to meet the needs of common data varieties. It’s worth reminding ourselves of the two broad types of data: structured data and nonstructured data. Structured data In relational database systems like Microsoft SQL Server, Azure SQL Database, and Azure SQL Data Warehouse, data structure is defined at design time. Data structure is designed in the form of tables. This means it’s designed before any information is loaded into the system. The data structure includes the relational model, table structure, column width, and data types. Relational systems react slowly to changes in … Continue reading Azure Platform for Data Engineers(part-2)

Azure Platform for Data Engineers(part-1)

Over the last 30 years, we’ve seen an exponential increase in the number of devices and software that generate data to meet current business and user needs. Businesses store, interpret, manage, transform, process, aggregate, and report this data to interested parties. These parties include internal management, investors, business partners, regulators, and consumers. Data consumers view data on PCs, tablets, and mobile devices that are either connected or disconnected. Consumers both generate and use data. They do this in the workplace and during leisure time with social media applications. Business stakeholders use data to make business decisions. Consumers use data to … Continue reading Azure Platform for Data Engineers(part-1)

Cloud Engineering: Creating a Virtual Machine

Introduction: Google Compute Engine lets you create virtual machines running different operating systems, including multiple flavors of Linux (Debian, Ubuntu, Suse, Red Hat, CoreOS) and Windows Server, on Google infrastructure. You can run thousands of virtual CPUs on a system that has been designed to be fast and to offer strong consistency of performance. Here, you’ll learn how to create virtual machine instances of various machine types using the Google Cloud Platform (GCP) Console and using the gcloud command line. You’ll also learn how to connect an NGINX web server to your virtual machine. You should type the commands to reinforce their … Continue reading Cloud Engineering: Creating a Virtual Machine

Get To Know About Big Data Analytics

Storing and Accessing Data, Comparison An RDBMS system keeps your table definitions (that is, the schema) in a data dictionary, which is tightly coupled with your tables: it’s always kept in exact alignment, accurately describing the tables you create. This tight coupling also means that the schema governs what is allowed to be stored as data. These systems are called schema on write because the schema is applied before the data is stored. Databases manage all insertions and updates, and they typically throw an error if you try to do something like insert a character string value into a numeric column. If the data doesn’t fit … Continue reading Get To Know About Big Data Analytics
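The schema-on-write versus schema-on-read contrast can be shown in miniature. The sketch below is a hypothetical pure-Python analogy (the `schema` dict and helper names are mine, not any database API): the first function validates each record as it is stored, the way an RDBMS does; the second stores raw records and only applies the schema when the data is read back.

```python
# Schema-on-write validates each record as it is stored (like an RDBMS);
# schema-on-read stores raw records and applies the schema at query time.

schema = {"id": int, "price": float}

def insert_schema_on_write(table, record):
    """Reject the write if any value does not match the declared type."""
    for col, typ in schema.items():
        if not isinstance(record.get(col), typ):
            raise TypeError(f"column {col!r} expects {typ.__name__}")
    table.append(record)

def read_schema_on_read(raw_rows):
    """Apply the schema at read time; bad rows only surface now."""
    parsed = []
    for row in raw_rows:
        try:
            parsed.append({col: typ(row[col]) for col, typ in schema.items()})
        except (ValueError, KeyError):
            pass  # a real system might route these rows to an error sink
    return parsed

table = []
insert_schema_on_write(table, {"id": 1, "price": 9.99})
try:
    insert_schema_on_write(table, {"id": "oops", "price": 9.99})
except TypeError as e:
    print("rejected at write time:", e)

# The schema-on-read store happily kept the bad row; it is filtered on read.
raw = [{"id": "1", "price": "9.99"}, {"id": "oops", "price": "9.99"}]
print(read_schema_on_read(raw))
```

The trade-off the excerpt describes falls out directly: schema-on-write pays the validation cost up front and guarantees clean tables; schema-on-read defers that cost, which is why big-data systems that ingest raw files favor it.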

How to Use AutoML and Vision API in GCP

What is the Vision API and what can it do? The Vision API is an API that uses machine learning and other Google services to extract information from images. The sorts of predictions that it can currently make include but are not limited to the following list: Label Detection, which is used to detect the presence of certain broad classes of objects within images Text Detection, which can be used to extract text from images, a process that is also referred to as OCR Safe Search Detection, which can be used to check if an image is safe to serve … Continue reading How to Use AutoML and Vision API in GCP

Developers Guide into Neo4j(present and future of database)

As a developer, you will create Neo4j Databases, add and update data in them, and query the data. When you learn to use Neo4j as a developer, you have three options: Neo4j Desktop, Neo4j Aura, or Neo4j Sandbox. In this module you will learn how to use each of these development environments and select the option that is best for your needs while you are learning about Neo4j. Many graph-enabled applications have been developed and deployed using Neo4j’s Community Edition (free). If your enterprise requires production features such as failover, clustering, monitoring, advanced access control, secure routing, etc., you will … Continue reading Developers Guide into Neo4j(present and future of database)

Insights from an e-commerce retail data set

We are using BigQuery as our data warehouse solution and standard SQL as the query language. For the dataset, we use Google Analytics logs of a merchant’s website. You need to enable your BigQuery account, which has a free daily limit; thereafter it is cost-effective. Click Navigation menu > BigQuery. Click Done. BigQuery public datasets are not displayed by default in the BigQuery web UI. To open the public datasets project, open https://console.cloud.google.com/bigquery?p=data-to-insights&page=ecommerce in a new browser window. In the left pane, in the Resource section, click data-to-insights. In the right pane, click Pin Project. Explore ecommerce data Problem: Your data analyst team exported … Continue reading Insights from an e-commerce retail data set

BigQuery ML(move your model towards data and not data towards model)

Overview BigQuery ML enables users to create and execute machine learning models in BigQuery using standard SQL queries. BigQuery ML democratizes machine learning by enabling SQL practitioners to build models using existing SQL tools and skills. BigQuery ML increases development speed by eliminating the need to move data. BigQuery ML functionality is available by using: The BigQuery web UI The bq command-line tool The BigQuery REST API An external tool such as a Jupyter notebook or business intelligence platform Is a data analyst a machine learning engineer now? Machine learning on large data sets requires extensive programming and knowledge of ML frameworks. These … Continue reading BigQuery ML(move your model towards data and not data towards model)

.NET Framework and Apache Spark

Why choose .NET for Apache Spark? .NET for Apache Spark empowers developers with .NET experience or code bases to participate in the world of big data analytics. .NET for Apache Spark provides high performance APIs for using Spark from C# and F#. With C# and F#, you can access: DataFrame and SparkSQL for working with structured data. Spark Structured Streaming for working with streaming data. Spark SQL for writing queries with SQL syntax. Machine learning integration for faster training and prediction (that is, use .NET for Apache Spark alongside ML.NET). .NET for Apache Spark is compliant with .NET Standard, a formal … Continue reading .Net framework and Apache spark

Analyzing logs in real time using Fluentd and BigQuery

This tutorial shows how to log browser traffic and analyze it in real time. This is useful when you have a significant amount of logging from various sources and you want to debug issues or generate up-to-date statistics from the logs. The tutorial describes how to send log information generated by an NGINX web server to BigQuery using Fluentd, and then use BigQuery to analyze the log information. It assumes that you have basic familiarity with Google Cloud Platform (GCP), Linux command lines, application log collection, and log analysis. Build dataprep skills real time with me here. Introduction Logs are a powerful … Continue reading Analyzing logs in real time using Fluentd and BigQuery

What is quantum computing?

There are some problems so difficult, so incredibly vast, that even if every supercomputer in the world worked on the problem, it would still take longer than the lifetime of the universe to solve. Quantum computers hold the promise to solve some of our planet’s biggest challenges – in environment, agriculture, health, energy, climate, materials science, and problems we’ve not yet even imagined. The impact of quantum computers will be far-reaching and have as great an impact as the creation of the transistor in 1947, which paved the way for today’s digital economy. Quantum computing harnesses the unique behavior of … Continue reading What is quantum computing?

How to use Cloud Storage and Cloud SQL

In this post, you create a Cloud Storage bucket and place an image in it. You’ll also configure an application running in Compute Engine to use a database managed by Cloud SQL. For this lab, you will configure a web server with PHP, a web development environment that is the basis for popular blogging software. Outside this lab, you will use analogous techniques to configure these packages. You also configure the web server to reference the image in the Cloud Storage bucket. Objectives In this lab, you learn how to perform the following tasks: Create a Cloud Storage bucket and … Continue reading How to use Cloud Storage and Cloud SQL

Use Of Cloud IoT Core

Use IoT Core to create a registry Use IoT Core to create a device Use Stackdriver Logging to view device logs Enable APIs In this section, you check that all the APIs you will use in this lab are enabled. In the GCP Console, on the Navigation menu, click APIs & Services. Scroll down and confirm that your APIs are enabled. Cloud IoT API Cloud Pub/Sub API Container Registry API If an API is disabled, click Enable APIs and services at the top, search for the API by name, and enable it for your project. Make sure you are in the correct Qwiklabs project. … Continue reading Use Of Cloud IoT Core

Using Pubsub to publish messages

Google Cloud Pub/Sub is a messaging service for exchanging event data among applications and services. A producer of data publishes messages to a Cloud Pub/Sub topic. A consumer creates a subscription to that topic. Subscribers either pull messages from a subscription or are configured as webhooks for push subscriptions. Every subscriber must acknowledge each message within a configurable window of time. … Continue reading Using Pubsub to publish messages
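The topic/subscription/ack model described above can be sketched as a minimal in-memory simulation. The `Topic` class and its method names below are illustrative only, not the real google-cloud-pubsub client API: the point is the fan-out (each subscription gets its own copy) and per-subscriber acknowledgement.

```python
from collections import deque

# Minimal in-memory sketch of the Pub/Sub model: a topic fans each
# published message out to every subscription, and each subscriber
# pulls and acknowledges independently.

class Topic:
    def __init__(self, name):
        self.name = name
        self.subscriptions = {}

    def subscribe(self, sub_name):
        self.subscriptions[sub_name] = deque()

    def publish(self, message):
        # Fan-out: every subscription gets its own copy of the message.
        for queue in self.subscriptions.values():
            queue.append(message)

    def pull(self, sub_name):
        # Peek without removing: the message stays until it is acked.
        queue = self.subscriptions[sub_name]
        return queue[0] if queue else None

    def ack(self, sub_name):
        # Acknowledging removes the message so it is not redelivered.
        self.subscriptions[sub_name].popleft()

topic = Topic("sensor-events")
topic.subscribe("dashboard")
topic.subscribe("archiver")
topic.publish("temperature=21.5")

# Both subscribers see the message; each must ack it separately.
msg = topic.pull("dashboard")
topic.ack("dashboard")
print(msg, topic.pull("archiver"))
```

The real service adds durability, at-least-once delivery with redelivery timeouts, and push (webhook) subscriptions on top of this basic shape.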

IoT Sensors and Connections

A sensor is a module that observes changes in its environment and sends information about these changes to a device. Devices collect data from sensors and send it to the cloud. Devices can be very small and have very few resources in terms of compute, storage, and so on. They might be able to communicate only through networks that cannot reach a cloud platform directly, such as over Bluetooth Low Energy (BLE). Standard devices are more likely to resemble small computers and may have the ability to store, process, and analyze data before sending it to the cloud. There are … Continue reading IOT Sensors and connections

IoT in GCP

Security is critical when deploying and managing an IoT network. Cloud IoT Core has several security features to protect your IoT network. Devices are authenticated individually, which means that if there is an attack on your IoT network, it is limited to one device rather than the whole fleet. There are four public key formats available for devices: RS256 and RS256_X509, and ES256 and ES256_X509. You specify the key format when creating the device. You can also define an expiration time for each device credential (public key). After it expires, the key is ignored but not automatically deleted. If you don’t … Continue reading IoT in GCP

Cloud IoT Core

Cloud IoT Core is a fully managed service that allows you to easily and securely connect, manage, and ingest data from millions of globally dispersed devices. Cloud IoT Core, in combination with other services on Google Cloud platform, provides a complete solution for collecting, processing, analyzing, and visualizing IoT data in real time to support improved operational efficiency. You will transmit telemetry messages from a device and the device will respond to configuration changes from a server based on real-time data. The devices in this system publish temperature data to their telemetry feeds, and a server consumes the telemetry data … Continue reading Cloud IOT Core

Query GitHub data using BigQuery

BigQuery is Google’s fully managed, NoOps, low cost analytics database. With BigQuery you can query terabytes of data without needing a database administrator or any infrastructure to manage. BigQuery uses familiar SQL and a pay-only-for-what-you-use charging model. BigQuery allows you to focus on analyzing data to find meaningful insights. In this post we’ll see how to query the GitHub public dataset to grab hands on experience with it. Sign-in to Google Cloud Platform console (console.cloud.google.com) and navigate to BigQuery. You can also open the BigQuery web UI directly by entering the following URL in your browser. Accept the terms of service. … Continue reading Query GitHub data using BigQuery

Recommend Products using ML with Cloud SQL and Dataproc

As our goal is to provide a demo, we are using Cloud SQL; for horizontal scaling you could use Cloud Spanner instead. Our goals are to: Create a Cloud SQL instance. Create database tables by importing .sql files from Cloud Storage. Populate the tables by importing .csv files from Cloud Storage. Allow access to Cloud SQL. Explore the rentals data using SQL statements from Cloud Shell. You populate rentals … Continue reading Recommend Products using ML with Cloud SQL and Dataproc

Dimensionality reduction using sklearn a way of reducing burden

Principal component analysis (PCA): PCA is used to decompose a multivariate dataset in a set of successive orthogonal components that explain a maximum amount of the variance. In scikit-learn, PCA is implemented as a transformer object that learns n components in its fit method, and can be used on new data to project it on these components. PCA centers but does not scale the input data for each feature before applying the SVD. The optional parameter whiten=True makes it possible to project the data onto the singular space while scaling each component to unit variance. The PCA object also provides a probabilistic interpretation of the PCA that can give a likelihood … Continue reading Dimensionality reduction using sklearn a way of reducing burden
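What PCA's fit step computes can be sketched without any library at all: center the data, form the covariance matrix, and find its dominant eigenvector — the first principal component. scikit-learn uses the SVD rather than the power iteration shown here, and the 2-D data below is made up, but the component found is the same.

```python
import math

# PCA by hand on a tiny 2-D dataset whose points lie near the diagonal.

data = [(-2.0, -1.9), (-1.0, -1.1), (1.0, 0.9), (2.0, 2.1)]

# Center each feature (PCA centers but does not scale, as noted above).
mx = sum(x for x, _ in data) / len(data)
my = sum(y for _, y in data) / len(data)
centered = [(x - mx, y - my) for x, y in data]

# 2x2 covariance matrix.
n = len(centered)
cxx = sum(x * x for x, _ in centered) / n
cyy = sum(y * y for _, y in centered) / n
cxy = sum(x * y for x, y in centered) / n

# Power iteration: repeatedly apply the covariance matrix and normalize;
# the vector converges to the direction of maximum variance.
vx, vy = 1.0, 0.0
for _ in range(100):
    wx = cxx * vx + cxy * vy
    wy = cxy * vx + cyy * vy
    norm = math.hypot(wx, wy)
    vx, vy = wx / norm, wy / norm

print(round(vx, 2), round(vy, 2))  # roughly (0.71, 0.71): the diagonal direction
```

Projecting each centered point onto `(vx, vy)` gives its first principal coordinate; `whiten=True` in scikit-learn would additionally divide that coordinate by the component's singular value to give it unit variance.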

Machine Learning Crash Course (TensorFlow Examples)

At first glance, machine learning usually means supervised learning, so here is a brief overview of the terms used. The very first thing to keep in mind is framing your machine learning project: what do you want to achieve from the data? For example: A regression model predicts continuous values. For example, regression models make predictions that answer questions like the following: What is the value of a house in California? What is the probability that a user will click on this ad? A classification model predicts discrete values. For example, … Continue reading Machine Learning Crash Course (TensorFlow Examples)
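The framing distinction can be made concrete with a few lines of code: the same kind of input, two kinds of target. The house-price numbers below are hypothetical, and the least-squares fit is done by hand to keep the sketch dependency-free.

```python
# Framing in miniature: a regression target is a continuous number
# (a price); a classification target is a discrete label.

# Hypothetical data: house size (square meters) -> price (thousands).
sizes = [50.0, 70.0, 90.0, 110.0]
prices = [150.0, 200.0, 250.0, 300.0]

# Ordinary least squares by hand: slope = cov(x, y) / var(x).
n = len(sizes)
mean_x = sum(sizes) / n
mean_y = sum(prices) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(sizes, prices))
         / sum((x - mean_x) ** 2 for x in sizes))
intercept = mean_y - slope * mean_x

# Regression framing: predict a continuous value for an unseen house.
predicted_price = slope * 80.0 + intercept

# Classification framing: threshold the same quantity into discrete classes.
label = "expensive" if predicted_price > 220.0 else "affordable"
print(predicted_price, label)
```

Choosing between these framings is exactly the "what do you want out of the data" decision the crash course starts with: a price needs regression, a click/no-click outcome needs classification.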

Spark Cluster Overview

Apache Spark is a fast and general-purpose cluster computing system. It provides high-level APIs in Java, Scala, Python and R, and an optimized engine that supports general execution graphs. It also supports a rich set of higher-level tools including Spark SQL for SQL and structured data processing, MLlib for machine learning, GraphX for graph processing, and Spark Streaming. Security in Spark is OFF by default. This could mean you are vulnerable to attack by default. Spark uses Hadoop’s client libraries for HDFS and YARN.  Users can also download a “Hadoop free” binary and run Spark with any Hadoop version by augmenting Spark’s classpath. Scala and Java users can … Continue reading Spark Cluster Overview

Be different: build a machine learning model with a few extra lines in your SQL query and grab attention

From the introduction you can probably guess it: yes, we are talking about BigQuery ML. BigQuery ML enables users to create and execute machine learning models in BigQuery using standard SQL queries. BigQuery ML democratizes machine learning by enabling SQL practitioners to build models using existing SQL tools and skills. BigQuery ML increases development speed by eliminating the need to move data. … Continue reading Be different: build a machine learning model with a few extra lines in your SQL query and grab attention

Build a tool in Google Docs that reads the sentiment of your document by using Google’s Natural Language API

The Natural Language API is a pretrained machine learning model that can analyze syntax, extract entities, and evaluate the sentiment of text. It can be called from Google Docs to perform all of these functions. This post will walk you through calling the Natural Language API to recognize the sentiment of selected text in a Google Doc and highlight it based on that sentiment. What are we going to be building? Once this post is complete, you will be able to select text in a document and mark its sentiment, using a menu choice, as shown below. Text will be highlighted in … Continue reading Build A Tool in the Google docs that read the sentiment of your document by using Google’s Natural Language API

Build simple apps that convert text to speech and speech to text, in C#

As a developer, back in 2017 I always thought it would be nice to write machine learning code in C# on the .NET Framework, to show my manager that I knew enough to become team lead. But the past is the past, and I left that productive company. Most managers in the world are the same: full of dull insights, trying to bring people down and demotivate them from goals the managers themselves never reached. Anyway, the other day I was searching for memes on the internet, and all of a sudden one of the websites gave me two HD … Continue reading Build simple apps that convert text to speech and speech to text, in C#

Become a marketing expert by using Google Cloud products: learn the art of asking with a browser

First things first: we will discuss what BigQuery is and why we are choosing it. BigQuery is Google’s fully managed, NoOps, low-cost analytics database. With BigQuery you can query terabytes and terabytes of data without having any infrastructure to … Continue reading Become a marketing expert by using Google Cloud products: learn the art of asking with a browser

Bayes Classification with Cloud Datalab, Spark, and Pig on Google Cloud

Note: If you are really following along with this post, this job can take up to 1.5 hours to finish, and if you get stuck on a typo it will build your patience. In this post you will learn how to deploy a … Continue reading Bayes Classification with Cloud Datalab, Spark, and Pig on Google Cloud

Building an IoT Analytics Pipeline on Google Cloud Platform step by step

Let’s start with the definition of IoT: The term Internet of Things (IoT) refers to the interconnection of physical devices with the global Internet. These devices are equipped with sensors and networking hardware, and each is globally identifiable. Taken together, these capabilities … Continue reading Building an IoT Analytics Pipeline on Google Cloud Platform step by step

Cloud ML Engine Your Friend on cloud

What are we doing here? A bit of theory (not relativity, but Cloud ML Engine), a bit of TensorFlow (not Stack Overflow), and hands-on work to: Create a TensorFlow training application and validate it locally. Run your training job on a single worker instance in the cloud. Run your training job as a distributed training job in the cloud. Optimize your hyperparameters by using hyperparameter tuning. Deploy a model to support prediction. Request an online prediction and see the response. Request a batch prediction. What we are building here: a wide and deep model for predicting income category based on United States Census … Continue reading Cloud ML Engine Your Friend on cloud

Visualizing BigQuery data in a Jupyter notebook with SQL

BigQuery is a petabyte-scale analytics data warehouse that you can use to run SQL queries over vast amounts of data in near realtime. Data visualization tools can help you make sense of your BigQuery data and help you analyze the data interactively. You can use visualization tools to help you identify trends, respond to them, and make predictions using your data. In this tutorial, you use the BigQuery Python client library and Pandas in a Jupyter notebook to visualize data in the BigQuery natality sample table. … Continue reading Visualizing BigQuery data in a Jupyter notebook with SQL

Analyzing Financial Time Series Using BigQuery and Cloud Datalab

This solution illustrates the power and utility of BigQuery and Cloud Datalab as tools for quantitative analysis. The solution provides an introduction (this document) and gets you set up to run a notebook-based Cloud Datalab tutorial. If you’re a quantitative analyst, you use a … Continue reading Analyzing Financial Time Series Using BigQuery and Cloud Datalab

A/B testing

The A/B test (also known as a randomised controlled trial, or RCT, in the other sciences) is a powerful tool for product development. Some motivations: With the rise of digital marketing led by tools including Google Analytics, Google Adwords, and Facebook Ads, a key competitive advantage for businesses is using A/B testing to determine the effects of digital marketing efforts. Why? In short, small changes can have big effects. This is why A/B testing is a huge benefit. A/B testing enables us to determine whether changes in landing pages, popup forms, article titles, and other digital marketing decisions improve conversion rates … Continue reading A/B testing
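Reading out an A/B test typically comes down to a statistical test on two conversion rates. Below is a minimal two-proportion z-test in stdlib Python; the visitor and conversion counts are made up for illustration, and a real analysis would also consider test duration, multiple comparisons, and effect size.

```python
import math

# Two-proportion z-test: did variant B's conversion rate differ from A's?

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 1000 visitors per arm; the new landing page (B)
# converted 260 visitors versus 200 for the control (A).
z, p = two_proportion_z(200, 1000, 260, 1000)
print(round(z, 2), round(p, 4))  # a large z and a small p suggest a real effect
```

With these numbers the test reports z above 3 and p well below 0.01, so the landing-page change would be judged a genuine improvement rather than noise.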

Binomial Random Variables: Introduction

Binomial Random Variables So far, in our discussion about discrete random variables, we have been introduced to: The probability distribution, which tells us which values a variable takes, and how often it takes them. The mean of the random variable, which tells us the long-run average value that the random variable takes. The standard deviation of the random variable, which tells us a typical (or long-run average) distance between the mean of the random variable and the values it takes. We will now introduce a special class of discrete random variables that are very common, because as you’ll see, they … Continue reading Binomial Random Variables: Introduction
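The three ideas above — the distribution, the mean, and the standard deviation — can be computed directly for a binomial random variable. A small stdlib sketch (the choice of n = 10 coin flips with p = 0.5 is just for illustration):

```python
import math

# The binomial distribution in miniature: P(X = k) for n independent
# trials with success probability p, plus the mean and standard
# deviation described in the text.

def binom_pmf(k, n, p):
    # "n choose k" ways to place the successes, each with prob p^k (1-p)^(n-k).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.5
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]

mean = n * p                     # long-run average number of successes
sd = math.sqrt(n * p * (1 - p))  # typical distance from that average

print(round(sum(pmf), 10))  # probabilities over all outcomes sum to 1
print(mean, round(sd, 3))
```

For 10 fair coin flips this gives a mean of 5 successes with a standard deviation of about 1.58 — the "typical distance" from 5 that the text describes.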

How To Distribute Sample

Sampling Distributions Introduction Already on several occasions we have pointed out the important distinction between a population and a sample. In Exploratory Data Analysis, we learned to summarize and display values of a variable for a sample, such as displaying the blood types of 100 randomly chosen U.S. adults using a pie chart, or displaying the heights of 150 males using a histogram and supplementing it with the sample mean (X̄) and sample standard deviation (S). In our study of Probability and Random Variables, we discussed the long-run behavior of a variable, considering the population of all possible values taken by that variable. For example, we … Continue reading How To Distribute Sample
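The sampling distribution the excerpt builds toward can be simulated directly: draw many random samples from a population, record each sample mean, and watch the means cluster around the population mean. The synthetic "heights" below are made up for illustration.

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is reproducible

# A synthetic population of 10,000 heights centered near 175 cm.
population = [random.gauss(175, 7) for _ in range(10_000)]
pop_mean = statistics.mean(population)

# Draw 1,000 samples of size 150 and record each sample mean (x-bar),
# echoing the "heights of 150 males" example in the text.
sample_means = [
    statistics.mean(random.sample(population, 150)) for _ in range(1_000)
]

center = statistics.mean(sample_means)
spread = statistics.stdev(sample_means)

# The sample means center on the population mean, and their spread is
# roughly sigma / sqrt(n) -- far smaller than the spread of individual heights.
print(round(center - pop_mean, 2), round(spread, 2))
```

This is the empirical face of the central limit theorem: individual heights vary by about 7 cm, but means of samples of 150 vary by only about 7/√150 ≈ 0.57 cm.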

TensorFlow Machine Learning on the Amazon Deep Learning AMI

TensorFlow is a popular framework used for machine learning. The Amazon Deep Learning AMI comes bundled with everything you need to start using TensorFlow from development through to production. In this post, you will develop, visualize, serve, and consume a TensorFlow machine learning model using the Amazon Deep Learning AMI.  Objectives Upon completion of this post you will be able to: Create machine learning models in TensorFlow Visualize TensorFlow graphs and the learning process in TensorBoard Serve trained TensorFlow models with TensorFlow Serving Create clients that consume served TensorFlow models, all with the Amazon Deep Learning AMI Prerequisites You should be familiar … Continue reading TensorFlow Machine Learning on the Amazon Deep Learning AMI