We applied machine learning to analyze 4,607,160 data points collected by scraping the front page of Reddit for 22 days. Read the post for insights into how Reddit's algorithms work. Why are people so damn interested in getting to the front page of Reddit? Look at any of the viral videos floating around Facebook. For example, a Reddit front-page giraffe story reached the CNN.com homepage within two hours. Our Reddit scraping process to collect 4 million data points:
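The post doesn't show its scraper, so here is a minimal sketch of how such a crawl could work, assuming the public Reddit front-page JSON listing; the polling cadence, recorded fields, and script structure are illustrative choices, not the authors' actual pipeline.

```python
import time
import requests

# Minimal sketch: poll the Reddit front page via its public JSON listing and
# record basic fields for each listed post. The listing fields used below
# (id, subreddit, ups, num_comments, created_utc) exist in Reddit's JSON;
# everything else (user agent, interval, output handling) is illustrative.
HEADERS = {"User-Agent": "frontpage-research-script/0.1"}

def snapshot_front_page():
    resp = requests.get("https://www.reddit.com/.json", headers=HEADERS, timeout=10)
    resp.raise_for_status()
    rows = []
    for child in resp.json()["data"]["children"]:
        post = child["data"]
        rows.append({
            "id": post["id"],
            "subreddit": post["subreddit"],
            "score": post["ups"],
            "comments": post["num_comments"],
            "created_utc": post["created_utc"],
            "observed_at": time.time(),
        })
    return rows

if __name__ == "__main__":
    # Repeating snapshots like this over many days yields the kind of
    # time series the analysis above is built on.
    for row in snapshot_front_page():
        print(row)
```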
The breakthrough involves phase-change memory (PCM): IBM has successfully stored three bits of data per cell for the first time, compared with previous demonstrations of one bit per cell. PCM is much more durable than flash – it can last something like 10 million write cycles, compared with the roughly 3,000 write cycles an average flash USB stick endures – and it's far faster, approaching DRAM performance, but with one big difference: it doesn't lose data when switched off, as DRAM does. Dr Haris Pozidis, manager of non-volatile memory research at IBM Research, commented: "Phase change memory is the first instantiation of a universal memory with properties of both DRAM and flash, thus answering one of the grand challenges of our industry." The new memory tech could have implications across a range of uses, including providing blazingly quick storage for cloud and IoT applications and boosting the performance of workloads such as machine learning. Businesses could see entire databases stored in PCM, enabling ultra-fast querying, and the technology would also make a major difference to smartphones. IBM envisions hybrid applications with PCM running alongside traditional flash storage in a phone, but with the OS stored in the PCM, so when you switch your phone on it loads almost immediately, and you're staring at your home screen before you've had time to bat an eyelid.
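A quick back-of-the-envelope illustration of what three bits per cell implies: each cell must be programmed to and read back from one of 2^3 = 8 distinguishable levels, tripling density over a single-bit cell. The figures below are arithmetic only, not IBM's device parameters.

```python
# Back-of-the-envelope arithmetic only; none of these figures are IBM device specs.
bits_per_cell_old = 1
bits_per_cell_new = 3

levels_old = 2 ** bits_per_cell_old   # 2 distinguishable levels per cell
levels_new = 2 ** bits_per_cell_new   # 8 distinguishable levels per cell

density_gain = bits_per_cell_new / bits_per_cell_old
print(f"{levels_new} levels per cell -> {bits_per_cell_new} bits per cell, "
      f"{density_gain:.0f}x the capacity of a {levels_old}-bit ({levels_old * 2}-level) cell")
```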
Google has begun to build its own custom application-specific integrated circuit (ASIC) chips, called tensor processing units (TPUs), Google chief executive Sundar Pichai said today at the Google I/O developer conference in Mountain View, California. The name is inspired by Google's TensorFlow open source deep learning framework. When you use the Google Cloud Platform, you can take advantage of TPUs as well, Pichai said. Specialty hardware – sort of taking a cue from the holographic processing unit (HPU) inside Microsoft's HoloLens augmented reality headset – will not be the only thing that makes the Google public cloud stand out from market leader Amazon Web Services (AWS). Also, over time Google will expose more and more machine learning APIs, Pichai said. "Our goal is to lead the industry on machine learning and make that innovation available to our customers," Google distinguished hardware engineer Norm Jouppi wrote in a blog post. "Building TPUs into our infrastructure stack will allow us to bring the power of Google to developers across software like TensorFlow and Cloud Machine Learning with advanced acceleration capabilities. Machine learning is transforming how developers build intelligent applications that benefit customers and consumers, and we're excited to see the possibilities come to life."
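For readers unfamiliar with TensorFlow, here is a minimal sketch of the dataflow-graph style of computation the framework (and hence the TPU) is built to accelerate. It assumes the 1.x-era graph-and-session API that was current at the time of this announcement, and the tensor values are arbitrary.

```python
import tensorflow as tf

# Minimal sketch of TensorFlow's dataflow model, written against the 1.x-era
# graph/session API that was current when TPUs were announced.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[0.5], [0.25]])
product = tf.matmul(a, b)  # a node in the graph; nothing runs until the session executes it

with tf.Session() as sess:
    # Dense tensor operations like this matmul are the workload TPUs accelerate.
    print(sess.run(product))
```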
Hello from Google I/O, Google's annual developers conference. This year, Google I/O has moved from its traditional home in San Francisco to an outdoor setting: the Shoreline Amphitheater in Mountain View, California – across the street from Google's main Googleplex headquarters. Official details of Google Home, Google's voice-activated rival to the Amazon Echo, are expected to be announced. Machine learning will likely come up, including Google perhaps finally getting into the bot game. You can watch the keynote through the live stream at the Google I/O site or follow along as we live blog it from the event itself when it begins at 10am PT. Our live blog is below.
Fitness apps will start automatically, and users can respond to messages with machine-learning-based replies. Android Wear, Google's operating system for smartwatches, is getting its biggest update yet with an upcoming 2.0 release that brings improved features for messaging and fitness. Among the improvements, Android Wear 2.0 will detect when you're starting to exercise and automatically fire up an app such as Strava. Google is also taking aim at how the Apple Watch displays information from applications by allowing developers to display information from any app when users glance at their device. That's important for helping people quickly see data from the apps they use frequently. People who want to converse with friends from their wrists will also get new features, including a redesigned keyboard, support for handwriting recognition, and smart replies that offer machine-learning-driven responses based on the context of a conversation. At a time when smartwatches are still a niche item, that could be useful for Google.
You're probably giving away more than you think. The location stamps on just a handful of Twitter posts can help even low-tech stalkers find you, researchers found. The notion of online privacy has been greatly diminished in recent years, and just this week two new studies confirm what to many minds is already a dismal picture. First, a study reported on Monday by Stanford University found that smartphone metadata – information about calls and text messages, such as time and length – can reveal a surprising amount of personal detail. Based on frequent calls to a local firearms dealer that prominently advertises AR semiautomatic rifles, and to the customer support hotline of a major manufacturer that produces them, it's logical to conclude that the caller likely owns such a weapon. Currently, U.S. law gives more privacy protections to call content and makes it easier for government agencies to obtain metadata, in part because policymakers assume that it shouldn't be possible to infer specific sensitive details about people based on metadata alone. "Many people have this idea that only machine-learning techniques can discover interesting patterns in location data, and they feel secure that not everyone has the technical knowledge to do that," said Ilaria Liccardi, a research scientist at MIT's Internet Policy Research Initiative and first author on the paper.
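To make the "low-tech" point concrete, here is a minimal sketch of the kind of inference someone could run on a handful of geotagged posts without any machine learning: round the coordinates to a coarse grid and count which cell recurs. The coordinates are invented and the grid resolution is an illustrative choice, not the researchers' method.

```python
from collections import Counter

# Minimal sketch of low-tech location inference from geotagged posts:
# round coordinates to a coarse grid and see which cell occurs most often.
# The coordinates below are invented purely for illustration.
geotags = [
    (40.7291, -73.9965), (40.7288, -73.9971), (40.7612, -73.9820),
    (40.7290, -73.9968), (40.7289, -73.9962),
]

def most_frequent_cell(points, precision=3):
    # Rounding to 3 decimal places gives roughly 100 m grid cells in latitude.
    cells = Counter((round(lat, precision), round(lon, precision)) for lat, lon in points)
    return cells.most_common(1)[0]

cell, hits = most_frequent_cell(geotags)
print(f"Most frequent grid cell: {cell} ({hits} of {len(geotags)} posts)")
```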
Forget the CPU, GPU, and FPGA: Google says its Tensor Processing Unit, or TPU, advances machine learning capability by a factor of three generations. "This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore's Law)," the blog said. "Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models, and apply these models more quickly, so users get more intelligent results more rapidly." The tiny TPU can fit into a hard drive slot within the data center rack and has already been powering RankBrain and Street View, the blog said. Analyst Patrick Moorhead of Moor Insights & Strategy, who attended the I/O developer conference, said that from what little Google has revealed about the TPU, he doesn't think the company is about to abandon traditional CPUs and GPUs just yet. He likened the comparison to decoding an H.265 video stream with a CPU versus an ASIC built for that task.
Answer by Scott Aaronson, theoretical computer scientist at MIT, soon to be at UT Austin, on Quora. There are things like Deflategate or manspreading or the dresses worn at the Oscars, which many people talk about but few should. And then there are things like World War II, global warming, black holes, or machine learning, which many people talk about and probably many should. Indeed, I started out in AI and machine learning, as an undergrad at Cornell with Bart Selman and then as a grad student at Berkeley with Mike Jordan, before shifting into quantum computing, where I felt like my comparative advantage was greater. On the other hand, at least according to the ML researchers I know, the recent progress has not involved any major new conceptual breakthroughs: it's been more about further refinement of algorithms that already existed in the '70s and '80s, and, of course, implementing those algorithms on orders-of-magnitude faster computers and training them with orders-of-magnitude more data. In the end, I suppose it's less interesting to me to look at the sheer amount of machine learning hype than at its content.
In machine learning, the ultimate goal is to train a machine or computer to learn and infer like a human, taking into account much more information and making better decisions in exponentially less time than humans are able to. At present, most learning systems still require some human feedback and training after initial data analysis. In contrast, unsupervised learning systems freely analyze patterns in unlabeled data, with no corresponding error or reward signal linked to a conclusion. Despite significant progress, the underlying processes of unsupervised learning – what's really happening at the level of the artificial neurons – are still a mystery. Risks include building models that fail or don't work in unexpected situations; however, the real gap lies in the lack of explanation a system gives for its findings or results – human beings are still the ultimate interpreters. It may be that an unsupervised learning system comes up with a set of conclusions that are nonsensical or seemingly indecipherable to human beings, so finding ways to synthesize analysis with meaning is a significant code that still needs to be cracked.
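As a concrete instance of that unsupervised setting, here is a minimal sketch using k-means clustering: the algorithm receives only unlabeled points and groups them by structure it discovers itself, with no error or reward attached to any conclusion. The data and cluster count are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Minimal sketch of unsupervised learning: the model sees only unlabeled points
# and finds groupings on its own, with no labels, error signal, or reward.
# The two synthetic blobs below are invented purely for illustration.
rng = np.random.default_rng(0)
points = np.vstack([
    rng.normal(loc=(0, 0), scale=0.5, size=(50, 2)),   # one unlabeled blob
    rng.normal(loc=(5, 5), scale=0.5, size=(50, 2)),   # another unlabeled blob
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print("Cluster centers found without any labels:\n", model.cluster_centers_)
```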
And yes, it's Google assistant – lower-case assistant – not Google Assistant. "We really need to help them get things done in the real world. This is why we're evolving search to be more assistive," Pichai said when speaking about Google assistant during the opening keynote of Google I/O, Google's big annual developers conference held today. The move still doesn't give Google's assistant a catchy name like Apple's Siri or the aforementioned Alexa and Cortana assistants. For example, the newly announced Google Home voice-activated home assistant was described as having Google assistant built in. Similarly, the new Allo messaging app was said to have Google assistant smarts, helping you automatically respond to messages or converse with Google itself to get things done.
Google announced Android Wear, a version of its popular mobile operating system specifically for wearables, a little over two years ago. At I/O 2016 earlier today, the Alphabet-owned company showcased a preview of what it calls the biggest platform update yet. They're also redesigning key experiences to be more intuitive and adding things like smart replies, improved handwriting recognition, and a new keyboard – all of which are powered by Google's machine learning. In the fitness category, Google is adding automatic activity recognition and the ability for apps to exchange data with each other using the Google Fit API. Android Wear 2.0 also expands the ways you can listen to music during a workout, even if you forget to bring along your smartphone. Google has also published a Material Design guide for Android Wear 2.0 that'll assist developers in creating apps with a uniform design.
Google Chief Executive Sundar Pichai revealed new products and services that use smarter software to make decisions rather than follow instructions, part of a major push into artificial intelligence that he said would define the tech giant over the next decade. Google, a unit of Alphabet Inc., said it would soon start selling a device called Home that will answer users' questions and complete tasks for them, like scheduling appointments, playing music, and sending emails. "We think of it as a conversational assistant," Mr. Pichai told attendees at Google's annual developers conference, held at an outdoor concert venue near its Mountain View, Calif., headquarters. Google also said it would launch a new messaging app, called Allo, that would incorporate some of the same underlying technology as Home to create smarter conversations. Google has invested heavily in artificial intelligence in recent years to strengthen its existing products and spawn new ones. Researchers increasingly use one branch of artificial intelligence, called machine learning, to enable computers to teach themselves new skills by reviewing huge data sets.
The chip sped up the Go-playing software, allowing it to plot moves within the time limits of the match and look further ahead in the game. Google, a division of Alphabet Inc., has been using it for more than a year to accelerate artificial intelligence applications as the software techniques known as machine learning become increasingly important to its core businesses. Overall the chip, known as the Tensor Processing Unit, is 10 times faster than alternatives Google considered for this work, the company said. "Whether it's a ton less than we would have otherwise, I can't say." It allows the company to process all the text stored in its massive collection of StreetView images – things such as street signs and address numbers attached to the sides of houses – in just five days, much faster than previous methods, Mr. Jouppi said. The chip is also used in Google search ranking, photo processing, speech recognition, and language translation.
Deemed by many the ultimate technology for a smart IoT world, artificial intelligence (AI) is intelligence exhibited in software that allows machines, devices, services, and other 'things' to learn and behave like a human. AI's main goal is to find solutions to problems. This is done through reasoning, planning, natural language processing, perception, and other techniques. Other branches include, but are not limited to, genetic programming, heuristics, ontology, planning, and machine learning from experience and inference. According to Gartner, by 2020, 85% of customer interactions will be managed without the need for human intervention. Over $325m has been invested in AI startups in the last few years.
The TPU is in production use across Google's cloud, including powering the RankBrain search result sorting system and Google's voice recognition services. Part of that has to do with the way application development is heading – developers are building more and more applications in the cloud only, and don't want to worry about managing hardware configurations, maintenance, and updates. Another possible reason is that Google simply doesn't want to give its rivals access to the chips, which it likely spent a lot of time and money developing. Analyst Patrick Moorhead said he expects the chip will be used for inferencing, a part of machine learning operations that doesn't require as much flexibility. Right now, that's all Google is saying. Holzle said that the company will reveal more about the chip in a paper to be released this fall.
Today at I/O, Google took the wraps off its latest foray into the world of communications: the company announced Allo, a smart messaging app supercharged with machine learning and Google's new Google Assistant service (its answer to Amazon's Alexa), giving users the ability not just to chat with each other using animated graphics and enlarging/shrinking text, but to call in Google (and, later, other third-party apps) to share media, plan events, buy things, and even think of what to say to each other. The iOS and Android app is being unveiled today, but it will only go live this summer, Google says. So while there are already a number of popular messaging apps out there today – Facebook's WhatsApp and Messenger, Viber, Line, WeChat, and others like Slack focusing on the enterprise – it's no surprise to see Google pooling its strong cards to see if it can make its own messaging product fly. "We are building search to be much more assistive," Google's CEO Sundar Pichai noted today during the I/O keynote when unveiling Google Assistant and the many helping nudges it gives you when you are looking for information through Google.com. As with other messaging apps, users of Allo will be able to find people to chat with based on their phone numbers, and those who use Google accounts for services like Gmail will also be able to call in their contacts from those services. In addition to emoji and a whisper/loud mode where you can enlarge and shrink text to emphasize what you're saying, there are a number of AI-based details.
Update: The Google I/O liveblog keynote is being updated continually as the two-hour press conference rolls on. 11:50 AM: Sundar is back on stage talking about deep learning and machine learning, and focusing on advancements like curing diseases. Google is making app development through Android Studio smarter and faster, going as far as writing some of the code. 11:30 AM: Google is now shifting to Chrome, with over 1 billion active users every month, and it's touting its accelerated mobile pages, saying these pages load almost instantly. 10:00 AM: Google I/O has started with music, but not just any music – it looks to be music played through Project Jacquard, which puts all sorts of sensors in clothing thanks to Google's ATAP division. Here are your humble hosts for this Google I/O liveblog: Global Editor-in-Chief Darren Murph and Senior Mobile Editor Matt Swider.
You can also link it to an existing Google account, though you don't have to. Allo also "learns over time," and can suggest smart replies through its machine learning, natural language processing, and image recognition abilities. For example, if your friend sends you a picture of a dog, Allo could suggest the response "cute dog!" All of Allo's chats will be encrypted, but in an "incognito mode" similar to what it offers on Chrome, messages will be end-to-end encrypted, meaning that Google won't be able to read them at all. Google now has three separate messaging apps: Allo, Hangouts, and the newly launched Spaces. The key differentiator with Allo, though, is the smart assistant integration. Allo will launch on Android and iOS this summer.
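To illustrate the smart-reply idea in its simplest possible form, here is a toy sketch that maps labels an image classifier might emit to canned responses. The label set, the replies, and the stubbed classifier are invented for illustration; Allo's actual pipeline is not public and is certainly far more sophisticated.

```python
# Toy sketch of the smart-reply idea: canned responses keyed off a label that
# an image-recognition model might produce. Everything here (labels, replies,
# and the stubbed classifier) is invented for illustration only.
CANNED_REPLIES = {
    "dog": ["cute dog!", "aww", "what breed?"],
    "food": ["yum!", "where is that?"],
    "beach": ["jealous!", "looks amazing"],
}

def classify_image(image_bytes):
    # Stub standing in for a real image-recognition model.
    return "dog"

def suggest_replies(image_bytes, max_suggestions=3):
    label = classify_image(image_bytes)
    return CANNED_REPLIES.get(label, ["nice!"])[:max_suggestions]

print(suggest_replies(b"..."))  # -> ['cute dog!', 'aww', 'what breed?']
```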
Update: We've now gone hands-on with the new Android Wear 2.0 platform – check out the updates to the platform that we've tried out! The UI is also enhanced – with more minimal, muted colors and a darker theme to help conserve battery life – and it definitely looks a bit sleeker too. After that, you can clumsily sketch big letters or numbers to send an easy reply to the recipient – although you won't be able to send them a big diatribe easily. The text is recognised automatically though, and when we demoed the option it was very accurate – again, another example of Google's machine learning coming to the fore. There's also automatic activity tracking, so in theory your watch will be able to tell whether you're cycling or running and instantly start up the right app – Strava was highlighted in the presentation, meaning you'll never forget to track a little trot around the park again. Sadly, we couldn't get a definitive comment on what the minimum spec will be, as this is still being nailed down before the final release later this year.