“One of our strategic goals is to deliver rich, on-demand business insights through big data.” Google and Informatica are deepening their partnership to extend support for cloud data warehouse management and analytics on marketing data lakes. Informatica is expanding its offering on Google’s cloud platform to include enhanced Google BigQuery support for pushdown optimization, enabling its clients to process large workloads within BigQuery. Pushdown optimization moves data transformation processing down into a relational database: when the source data and target data both reside in a relational database, processing can be pushed down into that database instead of being executed externally. Informatica Intelligent Cloud Services (IICS) will also be made available to Google Cloud Platform customers, allowing them to oversee data integration, data governance and data quality.
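Informatica’s implementation is proprietary, but the underlying idea can be sketched with plain BigQuery SQL. The example below is a minimal illustration, assuming the google-cloud-bigquery Python client and hypothetical table and column names; the transformation runs entirely inside the warehouse rather than in an external ETL engine:

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default GCP credentials

# Source and target tables live in the same warehouse, so the whole
# transformation executes where the data is; no rows leave BigQuery.
client.query("""
CREATE OR REPLACE TABLE my_dataset.orders_clean AS
SELECT
  order_id,
  LOWER(TRIM(customer_email)) AS customer_email,
  SAFE_CAST(amount AS NUMERIC) AS amount
FROM my_dataset.orders_raw
WHERE amount IS NOT NULL
""").result()  # blocks until the in-database job finishes
```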
Microsoft executives say the next frontier in artificial intelligence will involve using human professionals’ expertise to train machine learning models. “Machine teaching” is essentially an interface that sits atop a machine learning layer to give people without the ability to code a way to train and deploy systems. Machine teaching can be applied to a number of areas, including text classification, conversational dialog, computer vision, and robotics, Microsoft VP for Business AI Gurdeep Pall told VentureBeat in a phone interview. Recent AI startup acquisitions and years of Microsoft Research work will fuel initiatives to make machine teaching more widely available in the future, he said. Initiatives in these areas will rely on nearly a decade of work by Microsoft Research and AI Group, existing tools like the company’s AirSim simulator, and Bonsai and Lobe, San Francisco Bay Area AI startups Microsoft acquired last year. Lobe focuses on deep learning without code, while Bonsai is designed to help enterprises train systems used in manufacturing, building management, and robotics.
Google Cloud is now using Anthos and AutoML to differentiate itself from market leaders Amazon Web Services (AWS) and Microsoft Azure, chief AI scientist Andrew Moore and product management director Rajen Sheth told VentureBeat. “If I’m running a small business or startup that is depending on the cloud provider’s technology, if I go with Google I can actually sell to customers the ability to run their models on premise, or on GCP, or other clouds, so this huge flexibility helps,” Moore said in a conversation with reporters Thursday. Google Cloud got a lot more flexible this week with the introduction of Anthos, a hybrid cloud management system that connects with AWS and Azure. In an increasingly competitive cloud market, Google made its pitch to the world by releasing dozens of new products and services at the Next conference in San Francisco this week. Developers with little coding experience can use AutoML, while AI Platform is for data scientists — part of Google’s attempt to deliver AI tools for creators across a spectrum of experiences. “Spreadsheets are a key tool for business analysts to generate value from data, and that connection makes it far easier for them to work on large volumes of information while using tools they are already comfortable with,” he said.
Supermetrics for BigQuery is the first plug-and-play data pipeline solution in the Google Cloud Platform marketplace. The solution is native to BigQuery, and it is designed to empower data-driven marketers. HELSINKI, Finland–(BUSINESS WIRE)–April 10, 2019–Marketers can now harness the power of Google BigQuery for storing, analyzing and reporting cross-channel performance data. It is now possible to connect data sources like Facebook, LinkedIn and Adobe Analytics to BigQuery with just a few clicks. Marketers can also extract data and use it in business intelligence and reporting tools without a single line of code or SQL.
Google is announcing several updates to its data analytics products at its Google Cloud Next developer conference today. The company wants to make it easier to move data to Google’s cloud, manipulate it and turn it into insights. First, Google wants to make it easier to access all your data from its platform: one new service lets you aggregate all of your data sets in a single interface. You can then figure out which data set you’ll need to process in BigQuery, for instance (BigQuery is Google’s cloud-based service for analyzing large amounts of data using SQL queries). You might also need to transfer some data to Google’s cloud.
Google today announced a new service that makes the power of BigQuery, its analytics data warehouse, available in Sheets, its web-based spreadsheet tool. These so-called ‘connected sheets’ face none of the usual limitations of Google’s regular spreadsheets, meaning there are no row limits, for example. Instead, users can take a massive dataset from BigQuery, with potentially billions of rows, and turn it into a pivot table. The idea here is to enable virtually anybody to make use of all of the data that is stored in BigQuery. That’s because, from the user’s perspective, this new kind of table is simply a spreadsheet, with all of the usual functionality you’d expect from a spreadsheet. With this, Sheets becomes a frontend for BigQuery — and virtually any business user knows how to use a spreadsheet.
Google today made its biggest updates to AutoML in nearly a year with the introduction of AutoML Video and AutoML Tables for structured data, two new classes of services in Google’s suite for automating the creation of AI systems. Cloud AutoML, for the creation of custom AI models, was first introduced in January 2018. AutoML Tables is a new way for people with no coding experience to create custom AI models using structured tabular datasets. Tables can ingest data from GCP’s BigQuery data warehouse and other storage providers. “We’re also seeing in most industries things like demand forecasting, all the way through to things like price optimization. All of those are structured data problems and things AutoML Tables can be applied to,” Google Cloud senior director of product management Rajen Sheth told reporters ahead of the release.
Coinciding with the database improvements announced this morning during Google’s annual Cloud Next conference, the Mountain View company unveiled a slew of new capabilities heading to its data analytics portfolio. The first is Cloud Data Fusion, a fully managed and cloud-native data integration service that’s available starting this week in beta. Google is pitching it as a way to ingest, integrate, and manipulate data using a library of open source transformations and over a hundred connectors, controlled mainly through a drag-and-drop interface where data sets and pipelines are represented visually, without code. Google also introduced Data Catalog in beta, a fully managed and scalable metadata management service with a search interface for data discovery, underpinned by the same search technology that supports Gmail and Drive. It boasts a cataloging system for capturing technical and business metadata, and it integrates with Cloud DLP and Cloud IAM for privileged access and control.
As the intrigue of blockchains settles to a quiet simmer, it’s time to ask: how far has the technology advanced? Yet for a technology that promises to bring transparency to the business of moving money, blockchain networks are remarkably opaque. Recently, a handful of new projects have set out to make it much easier to access and query blockchain data. And by doing so, they may shed light on how far cryptocurrency projects have come and how far they still have to go. Google is certainly the biggest player to enter the blockchain search field. This month, the company announced that it has made available, through its BigQuery cloud platform, the full data sets from eight of the most active blockchain networks: Bitcoin, Bitcoin Cash, Ethereum, Ethereum Classic, Zcash, Dash, Litecoin, and Dogecoin.
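These data sets are queryable with ordinary SQL. As a minimal sketch, assuming the google-cloud-bigquery Python client and the table layout Google publishes under the bigquery-public-data project, counting daily Bitcoin transactions looks like this:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Transactions per day across the most recent days in the public dataset.
sql = """
SELECT DATE(block_timestamp) AS day, COUNT(*) AS tx_count
FROM `bigquery-public-data.crypto_bitcoin.transactions`
GROUP BY day
ORDER BY day DESC
LIMIT 7
"""
for row in client.query(sql):
    print(row.day, row.tx_count)
```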
Google today announced its intention to acquire Alooma, a company that allows enterprises to combine all of their data sources into services like Google’s BigQuery, Amazon’s Redshift, Snowflake and Azure. The promise of Alooma is that it handles the data pipelines and manages them for its users. In addition to this data integration service, Alooma also helps with migrating to the cloud, cleaning up data and then using it for AI and machine learning use cases. “Here at Google Cloud, we’re committed to helping enterprise customers easily and securely migrate their data to our platform,” Google VP of engineering Amit Ganesh and Google Cloud Platform director of product management Dominic Preuss write today. “The addition of Alooma, subject to closing conditions, is a natural fit that allows us to offer customers a streamlined, automated migration experience to Google Cloud, and give them access to our full range of database services, from managed open source database offerings to solutions like Cloud Spanner and Cloud Bigtable.” Before the acquisition, Alooma had raised about $15 million, including an $11.2 million Series A round led by Lightspeed Venture Partners and Sequoia Capital in early 2016.
Mode, a five-year-old collaborative analytics platform based in San Francisco, has raised $23 million in Series C funding led by Valor Equity Partners. Foundation Capital and REV Venture Partners, which had led Mode’s Series A and B financing rounds, respectively, also joined the round, which brings the company’s total funding to $50 million. In some ways, the investment is a bet on the continuing need for data scientists, despite the many companies focused on making data analysis available and understandable to a broader swath of employees through tools like Snowflake and BigQuery. The way Mode cofounder and CEO Derek Steer sees it, owing to today’s tools, organizations may need fewer data scientists. But they also need to better empower those individuals to effectively answer key questions, like how clients are using their product. Mode does this through an integrated SQL editor, Python and R notebooks, and a visualization builder, which it says give users the flexibility to choose the level of abstraction they want for a given dataset.
GCP faster and more powerful, says Telegraph. The Telegraph has gone all-in on Google Cloud Platform (GCP), ditching its hybrid cloud structure – which previously combined an eclectic mix of infrastructure, including an on-premises data centre and elastic cloud provision from AWS – for operations based entirely on GCP, from advertising campaign analytics to image optimisation. The 160-year-old broadsheet publication has a print circulation of approximately 360,000 daily and claims to reach 23.1 million people a month across all platforms. It plans to use cloud-based machine learning capabilities to personalise the MyTelegraph app, predict the demand for physical newspapers, and classify content for journalists – including Telegraph images – to improve “discoverability” and upload speed. Lucian Craciun, Head of Engineering & Technology Platforms at The Telegraph, told Computer Business Review in a call: “We used to parse logs out of AWS with some earmarked clusters. A few years ago we started using BigQuery (GCP’s cloud data warehouse for analytics); it was really quick and really cheap; magic.”
“You can also create custom labels that can consist of any keys and values you choose.” Google’s enterprise data warehouse BigQuery has released new collaboration and public dataset features. BigQuery enables researchers to conduct Structured Query Language (SQL) queries within Google Cloud Platform. Users can investigate their hypotheses using tools such as machine learning on private or public datasets. A simple yet practical addition is the ability to share queries with colleagues: while looking at a saved query, the user can turn on ‘Link Sharing’, which makes the query visible to others.
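The custom labels from the opening quote can also be attached to individual query jobs through the BigQuery API. A short sketch with the Python client (the label keys and values here are hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Labels are free-form key/value pairs that surface in job metadata and
# billing exports, making per-team or per-project cost attribution easier.
job_config = bigquery.QueryJobConfig(
    labels={"team": "research", "purpose": "ad-hoc"}  # hypothetical labels
)
client.query("SELECT 1 AS sanity_check", job_config=job_config).result()
```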
AtScale today announced the close of a $50 million funding round to incorporate more machine learning into its data management service. The company helps businesses draw data from on-premise and cloud servers to break down silos, pool datasets, and connect business intelligence tools. AtScale also helps companies access data from data warehouses and online analytical processing (OLAP) systems, pulling data from services like Snowflake, Google BigQuery, and Microsoft Azure SQL Data Warehouse. “People want to move to the cloud for reasons we all know, but there are practical inhibitors for the global 2000, meaning all of the data management and architectures that they’ve built,” said AtScale CEO Christopher Lynch. “That’s the real opportunity for us in the market. We can bring big data workloads to the cloud because we eliminate the friction.”
Cloud data warehouse company Snowflake Computing has raised a whopping $450 million in a round of growth funding led by Sequoia Capital, with participation from Madrona Venture Group, Redpoint Ventures, Altimeter Capital, Capital One Growth Ventures, Sutter Hill Ventures, Wing Ventures, and Meritech Capital. Founded in 2012, San Mateo-based Snowflake sells database software that runs on Amazon Web Services (AWS) and, as of a few months back, Microsoft Azure. Its core raison d’être is to serve as a repository for holding and querying data, making it available for processing and analysis by myriad applications. It helps companies make sense of their wealth of data and spot patterns and trends, for example. There are, of course, many data warehousing solutions out there, including the likes of Microsoft’s SQL Data Warehouse, Google’s BigQuery, and Amazon Redshift, not to mention incarnations from more traditional players like Oracle and SAP. But Snowflake’s pitch is that its product has been purpose-built from the bottom up with the cloud in mind, while everyone within an organization can get “priority access” to the database.
Aclima, a San Francisco-based startup building Internet-connected air quality sensors, has announced plans to integrate its mobile sensing platform into Google’s global fleet of Street View vehicles. Google uses the Street View cars to map the land for Google Maps. Starting with 50 cars in Houston, Mexico City and Sydney, Aclima will capture air quality data by generating snapshots of carbon dioxide (CO2), carbon monoxide (CO), nitric oxide (NO), nitrogen dioxide (NO2), ozone (O3), and particulate matter (PM2.5) while the Google cars roam the streets. The idea is to ascertain, at a hyper-local level in each metropolitan area, where pollution may be high enough to cause breathing problems. The data will then be made available as a public dataset on Google BigQuery. Aclima has had a close relationship with Google for the past few years, and this is not its first ride in Street View cars.
A few months after releasing Bitcoin support for its BigQuery database tool, Google has debuted a new plug-in for analyzing the Ethereum (ETH) blockchain, providing users with a new avenue for directly interacting with Ethereum data. Services already exist that show information about wallet addresses and transactions, but until now, investigating the full breadth of data stored on the blockchain has been a cumbersome process. BigQuery can also connect directly with the Parity client. Regular users are now capable of reading all of the data stored on Ethereum’s blockchain, and BigQuery can show us graphs of all the Ethereum transfers, including transaction cost.
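That transfers-and-cost view corresponds to a simple aggregate over the public Ethereum tables. A hedged sketch, assuming the field names published in bigquery-public-data.crypto_ethereum (amounts are denominated in wei, hence the division by 1e18):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Daily ETH moved and total fees paid, converted from wei to ETH.
# CAST avoids INT64 overflow when summing gas_price * receipt_gas_used.
sql = """
SELECT
  DATE(block_timestamp) AS day,
  SUM(value) / 1e18 AS eth_transferred,
  SUM(CAST(gas_price AS NUMERIC) * receipt_gas_used) / 1e18 AS eth_fees
FROM `bigquery-public-data.crypto_ethereum.transactions`
GROUP BY day
ORDER BY day DESC
LIMIT 30
"""
for row in client.query(sql):
    print(row.day, row.eth_transferred, row.eth_fees)
```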
Programme aims to make public datasets (with an emphasis on geospatial material) available and ready to run machine learning tools on. Google Cloud Platform (GCP) has added an additional five petabytes (5PB) of data storage for public datasets to its BigQuery enterprise data warehouse, which already hosts over 100 machine learning-ready public datasets. The Google Cloud Public Datasets programme, launched in 2016, works with public data providers to store copies of high-value, high-demand public datasets in GCP to make them more accessible and discoverable. It currently hosts some 3PB of data, including Landsat data from the United States Geological Survey (USGS), along with Bitcoin blockchain transactions, GitHub Activity Data and Human Genome Variants. The additional storage will be available for the next five years. Shane Glass, Program Manager for the Google Cloud Public Datasets Program, said in a blog post: “We’re also continuing to curate and host datasets in BigQuery so users can leverage BigQuery Machine Learning to analyze data with machine learning using standard SQL queries… so that our users can JOIN their private data and the world’s public data with as little time and effort as possible.”
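The JOIN Glass describes is plain SQL. As a sketch, assuming a hypothetical private table my_dataset.tracked_repos with a repo_name column, joined against the hosted GitHub Activity dataset:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Enrich a private watchlist of repositories with the public per-language
# byte counts from the GitHub Activity public dataset.
sql = """
SELECT p.repo_name, lang.name AS language, lang.bytes AS language_bytes
FROM my_dataset.tracked_repos AS p
JOIN `bigquery-public-data.github_repos.languages` AS l
  ON p.repo_name = l.repo_name
CROSS JOIN UNNEST(l.language) AS lang
ORDER BY language_bytes DESC
"""
for row in client.query(sql):
    print(row.repo_name, row.language, row.language_bytes)
```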
One key to efficient analysis of big data is to do the computations where the data lives. In some cases, that means running R, Python, Java, or Scala programs in a database such as SQL Server, or in a big data environment such as Spark. But that takes some fairly technical programming and data science skills not often found among business analysts or SQL programmers. In addition, if you have to extract, transform, and load your datasets from your data warehouse into another data store for machine learning, you introduce delays into the process.
There are still a lot of obstacles to building machine learning models, and one of them is that developers often have to move a lot of data back and forth between their data warehouses and wherever they are building their models. Google is now making this part of the process a bit easier for the developers and data scientists in its ecosystem with BigQuery ML, a new feature of its BigQuery data warehouse that builds machine learning functionality right into BigQuery. Using BigQuery ML, developers can build models using linear and logistic regression right inside their data warehouse, without having to transfer data back and forth as they build and fine-tune their models. And all they have to do to build these models and get predictions is to write a bit of SQL. Moving data doesn’t sound like it should be a big issue, but developers often spend a lot of their time on this kind of grunt work — time that would be better spent on actually working on their models. BigQuery ML also promises to make it easier to build these models, even for developers who don’t have a lot of experience with machine learning.
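In practice, both training and prediction are expressed as SQL statements. A minimal sketch (model, table, and column names here are hypothetical; aliasing churned to the reserved column name label marks it as the training target):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Train a logistic-regression classifier without moving data out of BigQuery.
client.query("""
CREATE OR REPLACE MODEL my_dataset.churn_model
OPTIONS (model_type = 'logistic_reg') AS
SELECT
  tenure_months,
  monthly_spend,
  churned AS label  -- BigQuery ML trains against the column named `label`
FROM my_dataset.customer_history
""").result()

# Prediction is a query too: ML.PREDICT appends a predicted_label column.
for row in client.query("""
SELECT customer_id, predicted_label
FROM ML.PREDICT(MODEL my_dataset.churn_model,
  (SELECT customer_id, tenure_months, monthly_spend
   FROM my_dataset.current_customers))
"""):
    print(row.customer_id, row.predicted_label)
```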