
AI & Machine Learning News. December 10, 2018

The Business of Selling your Location – NYT Podcast (22m)
Smartphone apps track a staggering amount of data about our whereabouts every day. That data has become a hot commodity. Data reviewed by The New York Times shows more than 235 million locations captured from 1.2 million unique devices in the New York area during a three-day period in 2017. 2018-12-10 Read the full story.

Thasos Response to Apple Terms of Service Change • Integrity Research
In response to Apple changing its Terms of Service, geolocation data provider Thasos Group made the difficult decision to completely eliminate the use of iOS data in its service. The result has been a significant decrease in compliance concerns from hedge funds who value geolocation data. In late 2016 or early 2017 Apple changed its App Store Terms of Service for app developers to better address individuals' growing concerns about the privacy of the data collected by iPhone apps. In the wake of these changes, Apple has slowly started to remove third-party apps from the App Store for violating the new terms. The most concerning changes for the alternative data industry are found in Section 5.1 of the App Store Review Guidelines. These changes can be summarized as follows:
  • App developers must obtain consent from users to collect their data. In addition, users must be informed how and where their data is being used.
  • Data collected for one purpose may not be repurposed without additional user consent.
  • Data collected by apps sold through Apple's App Store may only be used for two reasons: to improve the app or to support the serving of advertising.
In addition to simply asking users to provide their permission to collect data, Apple has required iOS app developers to explain what the data is used for and how it is shared. Apple has also started cracking down on instances where the data is used for purposes unrelated to improving the user experience. 2018-12-05 16:09:14+00:00 Read the full story.

CloudQuant Thoughts… This has been a long time coming. Tracking our every move and selling that data for profit is an unequal deal. And the idea that any attempt to secure your privacy (VPN/Faraday cage) is an admission of some kind of guilt is a terrible idea that has been sown into society. The "I do nothing wrong so I do not see the problem" attitude is extremely short-sighted (Wikipedia – Nothing to Hide argument). It may well be time for the government, slow-footed as it is, to step in.

6 of my favorite case studies in Data Science!
Data scientists are numbers people. They have a deep understanding of statistics and algorithms, programming and hacking, and communication skills. Data science is about applying these three skill sets in a disciplined and systematic manner, with the goal of improving an aspect of the business. That's the data science process. In order to stay abreast of industry trends, data scientists often turn to case studies. Reviewing these is a helpful way for both aspiring and working data scientists to challenge themselves and learn more about a particular field, a different way of thinking, or ways to better their own company based on similar experiences. Allow us to share a few of our favorite data science case studies with you so you can see first hand how companies across a variety of industries leveraged big data to drive productivity, profits, and more.
  1. Gramener and Microsoft AI for Earth Help Nisqually River Foundation Augment Fish Identification by 73 Percent Accuracy Through Deep Learning AI Models
  2. How We Scaled Data Science To All Sides of Airbnb Over 5 Years of Hypergrowth
  3. Spotify’s “This Is” Playlists: The Ultimate Song Analysis For 50 Mainstream Artists
  4. A Leading Online Travel Agency Increases Revenues by 16 Percent with Actionable Analytics
  5. How Mint.com Grew from Zero to 1 Million Users
  6. Netflix: Using Big Data to Drive Big Engagement
2018-12-06 10:42:26+00:00 Read the full story.

CloudQuant Thoughts… A very nice collection of Data Science Case Studies.

Python Data Visualization 2018: Moving Toward Convergence
This post is the second in a three-part series on the current state of Python data visualization and the trends that emerged from SciPy 2018. In my previous post, I provided an overview of the myriad Python data visualization tools currently available, how they relate to each other, and their many differences. In this post we'll take a look at an important theme that emerged from SciPy 2018: convergence, i.e., Python libraries becoming more similar in capability as they mature over time and share ideas and approaches. These trends of convergence have started to erase some of the previously clear distinctions between the libraries. This is great for users, though it does make it more difficult to make blanket recommendations. As in the first post, we'll generally separate the SciVis projects (typically 3D plotting situated in real-world space) from InfoVis projects (typically 2D plotting situated on the page or screen surface with arbitrary coordinate axes). 2018-12-03 16:57:44+00:00 Read the full story.

CloudQuant Thoughts… As we put the finishing touches to CQ.AI we are considering adding some of these visualization tools. As I have said many times before, a good graphical representation can communicate a complicated idea in seconds.

Montreal startup Stradigi's AI game teaches people sign language
Responsibly applied artificial intelligence (AI) has the potential to solve some of the world's toughest challenges. One needn't look further for evidence than the winners of this week's IBM Watson AI XPRIZE wildcard round, which included a Montreal startup — Aifred Health — developing a model that helps clinicians choose personalized patient treatment programs. In related news, just this past Sunday, Google subsidiary DeepMind unveiled AlphaFold, AI that can predict protein folding more accurately than any system before it. Accessibility is another burgeoning area of what's been coined "AI for good" research, and one which Montreal startup Stradigi AI is committed to advancing with a new tool for the deaf and hearing impaired. At the NeurIPS 2018 conference in Montreal this week, the four-year-old startup — which cofounders Carolina Bessega and Jaime Camacaro pivoted from software development to AI research in 2016 — demoed a game that uses computer vision to help people learn American Sign Language (ASL). 2018-12-08 00:00:00 Read the full story.

How We Built the ASL Alphabet Game
At Stradigi AI, our best ideas often stem from collaboration between our departments, and the conception of our American Sign Language (ASL) Alphabet Game was no different. Developed as a demo for the NeurIPS conference, it took creativity, brain power, and meticulous coordination to bring this idea to life in a very short time. What we are presenting today is how our project came to life, and how this demo, rooted in our mission to promote AI for Good, advocates the idea that artificial intelligence really can make people's lives better. Here's how the ASL Alphabet Game came to be… Read the full story.

CloudQuant Thoughts… A pity they have not made it publicly available yet; I guess they are working hard on a two-handed version.
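Returning to the Python visualization story above: here is a minimal sketch of the convergence theme, drawing the same scatter plot with Matplotlib (static) and Bokeh (interactive), whose call patterns have grown quite similar. The pairing and the random data are my own illustration, not from the post.

```python
# Same scatter plot in two converging Python visualization libraries.
# Invented random data, for illustration only.
import numpy as np
import matplotlib.pyplot as plt
from bokeh.plotting import figure, show

x = np.random.rand(100)
y = x + 0.1 * np.random.randn(100)

# Matplotlib: renders a static image file.
plt.scatter(x, y)
plt.xlabel("x")
plt.ylabel("y")
plt.savefig("scatter_mpl.png")

# Bokeh: renders an interactive HTML plot with a very similar API.
p = figure(x_axis_label="x", y_axis_label="y")
p.circle(x, y)
show(p)  # writes an HTML file and opens it in a browser
```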
From seed to sip: How Anheuser-Busch InBev uses AI to drive growth
Anheuser-Busch InBev (AB InBev) is using artificial intelligence to drive growth and innovation across all dimensions of its global brewing business. The brewer of Budweiser, Corona, Stella Artois and more than 500 other beer brands has built a worldwide analytics platform on the Microsoft Azure cloud, enabling the company to draw data-driven insights about everything from optimal barley growing conditions to drivers of successful sales promotions. Tassilo Festetics, AB InBev's vice president for global solutions, shared insights about the company's AI strategy at a recent AI in Business event in San Francisco, which Transform edited into an abbreviated Q&A. How is Anheuser-Busch InBev using AI today? 2018-12-05 16:56:22+00:00 Read the full story.

CloudQuant Thoughts… I'll be honest, I only really put this one here as a challenge to myself to come up with some pithy retort… "Making AI Weiser one Bud at a time. That sounds like my kind of data science job!"

AI Weekly: 8 takeaways from NeurIPS 2018
After a weeklong whirlwind of talks, demonstrations, spotlight sessions, and posters, the Conference on Neural Information Processing Systems (NeurIPS) — one of the largest artificial intelligence (AI) and machine learning conferences of the year — is coming to a close, and it was a smashing success by any measure. There were an estimated 8,300 registered attendees at the Palais des Congrès de Montréal, where much of this week's action took place. This year's program featured 42 workshops and nine tutorials, and approximately 4,854 papers were submitted for consideration, 1,010 of which were accepted. That's all despite a bit of a preconference kerfuffle that led to the NeurIPS board changing the conference's acronym from "NIPS," which some attendees and sponsors had protested for its potentially offensive connotations. So what were this year's highlights? Well, Intel open-sourced a useful tool — HE-Transformer — that allows AI systems to operate on encrypted data. IBM detailed two breakthrough AI training techniques — a digital method that's up to 4 times faster than the previous state of the art and an analog chip with phase-change memory — that both retain 8-bit precision. And Nvidia described a generative model that can create three-dimensional environments using real-world videos from sources like YouTube. Those only scratched the surface… 2018-12-07 00:00:00 Read the full story.

CloudQuant Thoughts… So many of last week's stories obviously came out of this event. It would probably be worth checking out this post in case you missed anything else.

AlphaFold @ CASP13: "What just happened?"
I just came back from CASP13, the biennial assessment of protein structure prediction methods (I previously blogged about CASP10). I participated in a panel on deep learning methods in protein structure prediction, as well as a predictor (more on that later). If you keep tabs on science news, you may have heard that DeepMind's debut went rather well. So well, in fact, that not only did they take first place, but put a comfortable distance between t… 2018-12-09 00:00:00 Read the full story.

CloudQuant Thoughts… I read this three times and still didn't understand it, but it is obvious that DeepMind is disrupting yet another mystery-filled big data sector.
Artificial intelligence ethics researchers call for facial recognition regulation
A group of researchers including employees from Google and Microsoft have called for regulation of "oppressive" facial recognition technology. Concerns were raised in a report published by AI Now, a research group dedicated to monitoring the safe application of technology. It stated: "Facial recognition and affect recognition [such as the detection of personality traits] need stringent regulation to protect the public interest. Such regulation should include national laws that require strong oversight, clear limitations, and public transparency. Communities should have the right to reject the application of these technologies in both public and private contexts. Mere public notice of their use is not sufficient, and there should be a high threshold for any consent, given the dangers of oppressive and continual mass surveillance." South Wales Police, the Metropolitan Police in London and Leicestershire Police all use the technology, but doubts have been cast over its reliability. A recent study found the systems, created by Japanese company NEC, struggled to identify suspects wearing hats and glasses. 2018-12-06 00:00:00 Read the full story.

CloudQuant Thoughts… At this moment in time it seems like the best way democracy can distance itself from autocratic rule is by enshrining regulations to defend our individual data privacy.
What Great Data Analysts Do — and Why Every Organization Needs Them
The top trophy hire in data science is elusive, and it's no surprise: a "full-stack" data scientist has mastery of machine learning, statistics, and analytics. When teams can't get their hands on a three-in-one polymath, they set their sights on luring the most impressive prize among the single-origin specialists. Which of those skills gets the pedestal? Today's fashion in data science favors flashy sophistication with a dash of sci-fi, making AI and machine learning the darlings of the job market. Alternative challengers for the alpha spot come from statistics, thanks to a century-long reputation for rigor and mathematical superiority. What about analysts? If your primary skill is analytics (or data-mining or business intelligence), chances are that your self-confidence has taken a beating as machine learning and statistics have become prized within companies, the job market, and the media. But what the uninitiated rarely grasp is that the three professions under the data science umbrella are completely different from one another. They may use some of the same methods and equations, but that's where the similarity ends. Far from being a lesser version of the other data science breeds, good analysts are a prerequisite for effectiveness in your data endeavors. It's dangerous to have them quit on you, but that's exactly what they'll do if you under-appreciate them. 2018-12-04 22:35:01+00:00 Read the full story.

Here Are the Most in Demand Skills for Data Scientists
I scoured job listing websites to find which skills are most in demand for data scientists. I looked at general data science skills and at specific languages and tools separately. I searched job listings on LinkedIn, Indeed, SimplyHired, Monster, and AngelList on October 10, 2018. I read through many job listings and surveys to find the most common skills. Terms like management were not compared because they can be used in so many different contexts in job listings. All searches were performed for the United States with "data scientist" "[keyword]". Using exact match search reduced the number of results. However, this method ensured the results were relevant for data scientist positions and affected all search terms similarly. 2018-12-07 15:00:46+00:00 Read the full story.

SoftBank prices its $23.5 billion IPO — one of the biggest of all time
SoftBank priced its initial public offering Monday. The conglomerate is set to raise as much as 2.65 trillion Japanese yen ($23.5 billion) through the issuance of up to 1.76 billion shares at 1,500 yen ($13) apiece. That makes it the biggest-ever IPO in Japan and one of the largest of all time globally, just shy of Alibaba's $25 billion US IPO. SoftBank will debut on the Tokyo Stock Exchange on December 19. 2018-12-10 00:00:00 Read the full story.

Hitachi Ups Game for Managing Unstructured Data
Most enterprise data go unused, and according to some studies very little unstructured data in the form of text, audio and video makes its way into the hands of data analysts. Hence, data integrators see an opportunity to combine their tools with object storage and hybrid cloud deployments to expand the data analytics ecosystem currently drowning in unstructured data generated by social media and other sources.
That's the impetus behind the latest version of Hitachi's year-old Vantara data integration unit, whose Pentaho 8.2 release combines analytics software with its object storage platform to help data analysts get a handle on unstructured data. Hitachi Vantara, based in Santa Clara, Calif., said the release also helps manage hybrid cloud deployments. 2018-12-04 00:00:00 Read the full story.

Amobee Collaborates with Oracle Data Cloud to Activate Third-Party Data Across Programmatic and Social Media Platforms
Amobee, a global digital marketing technology company serving brands and agencies, has expanded its collaboration with Oracle Data Cloud to become one of the first companies to activate third-party data across programmatic and social media through its platform, providing marketers seamless activation across digital channels. Amobee unifies key programmatic channels—including major social media platforms, formats, and devices—to provide agencies and leading brands with advanced data management and media planning capabilities as well as actionable, real-time market research and proprietary audience data. The collaboration allows marketers to access offline purchase-based transaction datasets through Oracle Data Cloud and activate them across social media and other digital programmatic media channels through Amobee, streamlining their digital ad buy. 2018-12-05 00:00:00 Read the full story.

Melissa Launches Suite of AI Solutions that Can Help Transform Clinical Data
Melissa, a provider of global contact data quality and identity verification solutions, is releasing a suite of advanced artificial intelligence (AI) solutions that combine machine reasoning, natural language processing, and machine learning. Melissa Informatics' Sentient (MIS) solution is a new and unique set of clinical data quality and integration tools that turn diverse, dirty, and disconnected data into a clean, research-ready data resource. "Too often, clinical data is expensively gathered and under-valued," said Bob Stanley, senior director, customer projects, Melissa Informatics. "When you apply machine learning and machine reasoning to access, curate, and integrate this data, it becomes ready for rewarding new uses in patient care, precision medicine research, intellectual property, and unexpected new revenue." 2018-12-04 00:00:00 Read the full story.

Buy Side Dips Toes in Alt Data
Challenges for the buy side were the focus of the Friday morning keynote at the WBR Equities Leaders Summit. The broad question up for discussion was how traditionally active managers will need to adapt their investment strategies as automated technologies continue to play a larger role in the trading life cycle. The conversation focused on alternative, or unstructured, data. While the topic was narrow, the discussion proved to be a broad window into the institutional buy side's approach to new methods and technologies. Large money managers are known for conservatism and not being first movers, and their gradual movement into alt data is representative of how this is in their DNA. 2018-12-07 16:49:46+00:00 Read the full story.

Here Are Ways AI is Helping Financial Institutions
Artificial intelligence (AI) is disrupting diverse industries, but banking is projected to benefit the most from incorporating AI systems in the next couple of years. Analysts estimate that AI will save the banking industry more than $1 trillion by 2030.
I have been talking with bank executives for the last couple of years and it is exciting to hear that the banking industry has started to seriously consider artificial intelligence-based solutions for many traditional banking problems. The use cases where executives are seeing value do vary based on size, location and the type of financial institution. However, some core attributes remain the same. For example, large banks have a huge customer success burden, so they naturally look toward automation of customer service with chatbots. Financial institutions like hedge funds are chasing alpha with AI on top of new layers of data sources, and insurance companies are improving risk models with AI. On the other hand, many of the financial institutions in developing countries are still stuck on setting up data infrastructure in a way that allows them to leverage AI. Here are a few problems and AI solutions that many financial institutions are actively pursuing to create value. Of course, this is not a comprehensive list of all the AI initiatives the finance industry is experimenting with, but I would say these are some of the most popular trends. 2018-12-07 15:20:52+00:00 Read the full story.

Gregory Piatetsky-Shapiro Shares His Insights on Data Science and Machine Learning
Gregory Piatetsky-Shapiro, Ph.D., is a well known data scientist, founder of KDnuggets and co-founder of the KDD conferences. He constantly features as a top influencer (LinkedIn Top Voice, 2018) in the knowledge discovery, data mining and data science domains. Gregory has also produced over 60 publications and edited several books and collections associated with data mining and knowledge discovery. Recently, we got in touch with him to have a quick conversation about the current and future state of data science, its implications for business growth, and key challenges. We enjoyed taking this interview, and we're certain that our readers are going to love the insights shared by Gregory. 2018-12-07 15:05:46+00:00 Read the full story.

Anticipated Trends of Artificial Intelligence for 2019
Artificial intelligence has been shaping business performance through enhanced planning, improved advertisements, risk management, sensible analysis and faster delivery. Machine learning can solve critical and complicated problems in a jiffy using techniques like problem solving, speech recognition, natural language processing (NLP), image recognition, etc. It is no wonder that machines are anticipated to take over human intelligence in the near future. When used at a global level for growing businesses, AI can bring monumental advancement in the long run, by complying with every necessity that mankind requires. Improvement of business performance through AI will not only increase the individual profits of organizations but will also accelerate global economic growth as a whole. IDC has predicted that artificial intelligence will support 40 percent of digital transformation initiatives and 100 percent of IoT initiatives. According to Gartner's predictions, by 2020, 85 percent of enterprise relationships will be managed in customer service without human interaction. Another AI prediction, by Servion Global Solutions, states that by 2025 AI is anticipated to influence 95 percent of customer service interactions, including live conversations over the telephone. 2018-12-07 00:00:00 Read the full story.

The Year Ahead: Data Will Drive the Enterprise in 2019
The year just ending has been an interesting one for data managers.
Artificial intelligence (AI) and machine learning took center stage, which also meant an increasingly glaring spotlight on data sourcing, management, and viability. The continued rise of the Internet of Things (IoT) also meant no letting up on demands for data environments to deliver requirements fast and furiously. The year ahead will bring more of the same—as well as a continuation of the transformation of information management. Here are some of the changes and challenges on the horizon for 2019, as seen by leading industry participants and observers. 2018-12-04 00:00:00 Read the full story.

Physics-guided Neural Networks (PGNNs) – Towards Data Science
Physics-based models are at the heart of today's technology and science. Over recent years, data-driven models have started providing an alternative approach, outperforming physics-driven models in many tasks. Even so, they are data hungry, their inferences can be hard to explain, and generalization remains a challenge. Combining data and physics could reveal the best of both worlds. When machine learning algorithms are learning, they are actually searching for a solution in the hypothesis space defined by your choice of algorithm, architecture, and configuration. The hypothesis space can be quite large even for a fairly simple algorithm, and data is the only guide we use to look for a solution in this huge space. What if we could use our knowledge of the world — for example, physics — together with data to guide this search? This is what Karpatne et al. explain in their paper Physics-guided Neural Networks (PGNN): An Application in Lake Temperature Modeling. In this post, I will explain why this idea is crucial, and I will also describe how they did it by summarizing the paper. 2018-12-10 00:47:36+00:00 Read the full story.

2019 Predictions About Artificial Intelligence That Will Make Your Head Spin
While the hip, ubiquitous business buzzwords are cryptocurrency and blockchain, the truly formidable factor of what is being called the fourth industrial revolution is artificial intelligence. Whether praised as a panacea for greater business efficiency or feared as the demise of humanity, artificial intelligence is upon us and will impact business and society at large in ways that we can only begin to imagine. Fasten your seatbelts. Here's what a few influencers in the arena say is on tap for 2019. First, Ibrahim Haddad, Director of Research at The Linux Foundation, says that there are two key areas to watch. "2019 is going to be the year of open source AI," predicts Haddad. "We're already seeing companies begin to open source their internal AI projects and stacks, and I expect to see this accelerate in the coming year." He says that the reason for such a move is that it increases innovation, enables faster time-to-market and lowers costs. "The cost of building a platform is high, and organizations are realizing the real value is in the models, training data and applications. We're going to see harmonization around a set of critical projects creating a comprehensive open source stack for AI, machine learning and deep learning." 2018-12-06 10:32:26+00:00 Read the full story.

Building a Successful Data Governance Strategy
One of the core elements of data analytics that organizations struggle with today is data governance. An organization could do everything right and still wonder why their analytics projects are failing if they haven't taken the time to build and implement a governance strategy.
Here are some (hopefully) helpful tips from experts for building a data governance framework that lasts. There are many aspects to getting data governance right, including having the right people in place. Large enterprises are increasingly hiring chief data officers (CDOs) to oversee data governance, particularly when regulations like the European Union's General Data Protection Regulation (GDPR) are involved. While the CDO title is rarer at smaller firms, there are still people with the same responsibility as a CDO – and at the top of the list is making sure the data is governed. Beyond the people, it's important to consider the core data governance processes that are critical for success. According to Vimal Vel, vice president and global head of master data solutions at data solutions provider Dun & Bradstreet, one of the first steps in building a successful data governance practice is understanding the business drivers and business outcomes that matter to the organization. "A lot of time we notice customers dive into data strategy or data governance straight away," he says. "It's important to take the time to understand what kinds of business outcomes you're looking to drive, and what is the culture of your organization, both locally and globally." Much of the information that's generated, used, and managed in a data analytics project is tied directly to those business outcome measurements, Vel says. "So recognizing what your business outcomes are, and then organizing your data strategy around that, is step one," he says. 2018-12-07 00:00:00 Read the full story.

Achieving Data Governance 2.0 with Knowledge Graph Technology
When you use Google, pick a movie from Netflix, talk to Siri or Alexa, or look for your nephew on Facebook, you're benefiting from Knowledge Graph technology. DATAVERSITY® recently caught up with the three co-founders of TopQuadrant, CEO Irene Polikoff, CTO Ralph Hodgson, and CMO Robert Coyne, to get their perspective on Knowledge Graph technology and how it fits with today's Data Governance needs. 2018-12-05 00:35:06-08:00 Read the full story.

AI in 2019: 8 Trends to Watch
Forget the job-stealing robot predictions. Let's focus on artificial intelligence trends – around talent, security, data analytics, and more – that will matter to IT leaders. December means holiday parties, New Year's resolutions, and a blizzard of technology industry predictions. You'll see a ton of AI-related calls as we approach 2019. The artificial intelligence hype machine is already roaring. The potential impacts of AI are wide-ranging – as are the related forecasts, on everything from how AI will change college admissions to the role it will play in international relations and politics. We decided to focus on the trends that matter most urgently to IT leaders – you don't need another "AI is taking over the world" story; you need concrete insights for your team and business. So let's dig into the key trends in AI – as well as overlapping fields such as machine learning – that IT leaders should keep tabs on in 2019. 2018-12-04 12:36:50+00:00 Read the full story.

How AI is Increasing DBAs' Strategic Value
Increasingly, DBAs are seeing artificial intelligence (AI) and machine learning applied to database management and optimization, taking self-healing and self-tuning to the next level.
These solutions, from both database and third-party vendors, allow DBAs to spend less time searching for bottlenecks and more time doing more productive and creative work in support of strategic business goals. AI and machine learning have entered the mainstream in the last couple of years. Here are some brief descriptions to help understand this landscape before exploring how these technologies can benefit DBAs (a minimal code sketch follows the list):
  • AI covers anything where a machine imitates certain “cognitive” human functions such as learning and problem solving. Examples include automatic trading systems, autonomous cars and intelligent routing, and delivery systems.
  • Machine Learning is a subset of AI that uses statistical techniques to allow computers to model and predict outcomes using datasets. Examples include email filtering, fraud detection, and ranking systems to drive online marketing.
  • Deep Learning is a specific class of machine learning that uses artificial neural networks, as opposed to task-oriented machine learning algorithms. Examples include computer vision, speech recognition, and natural language processing.
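To make the middle bullet concrete, here is a minimal sketch of machine learning in the sense defined above: statistical techniques that let a computer model and predict outcomes from a dataset. The email-filtering example, the tiny corpus, and the choice of scikit-learn are my own illustrative assumptions, not from the article.

```python
# Toy illustration of the "Machine Learning" definition above:
# fit a statistical model on labeled examples, then predict outcomes.
# The four "emails" and their labels are invented for demonstration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

emails = [
    "win a free prize now",        # spam
    "cheap meds online",           # spam
    "meeting moved to 3pm",        # not spam
    "quarterly report attached",   # not spam
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

# Bag-of-words features feeding a simple linear classifier.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(emails, labels)

print(model.predict(["claim your free prize", "report for the meeting"]))
# expected (on this toy data): [1 0]
```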
2018-12-04 00:00:00 Read the full story.

Google fixes 'sexist' Translate tool
Google has altered its Translate tool after it was accused of sexism for defaulting translations to the masculine pronoun. If someone tries to translate a word from English into French, Portuguese or Spanish, it will now provide a feminine alternative. Executives told software engineers to go back to the drawing board and re-train the artificial intelligence after Stanford University computer scientists spotted Google translating news articles written in Spanish and changing sentences about women to "he said". It was also translating words like "nurse" into the feminine form and "doctor" into the masculine. In many languages, words that refer to people need a gender. To translate these, Google's algorithm had to choose either a masculine or feminine form. But the algorithm began to replicate existing stereotypes ingrained in the millions of translations it trawls and learns from on the web. 2018-12-06 00:00:00 Read the full story.

Yann LeCun Says Facebook Is 'Dust' Without Deep Learning, And No One Is Disagreeing
One of the foremost experts on deep learning, Facebook AI supremo and Chief AI Scientist at FAIR, Yann LeCun made another key announcement regarding deep learning. The French AI expert, who played a pivotal role in setting up Facebook's lab in Paris, shared in an interview with CNN: "If you take the deep learning out of Facebook today, Facebook's dust. It's entirely built around it now." The statement typifies the current stand on technology by the Mark Zuckerberg-led Menlo Park giant, which has been embattled over its role in US elections. Even though the statement has been largely dubbed controversial, LeCun summed up Facebook's transformation around AI and ML succinctly. Over the last three to four years, Facebook has been slowly but surely transforming its business around intelligent systems technology. The change can be seen in features such as posts, translations and newsfeed algorithms, which are the core of the social network platform. Facebook applied deep learning to combat hate speech and misinformation in countries like Myanmar. The social media giant was criticised when the platform was said to have fueled ethnic violence against the Rohingya population. The company also said an AI application created by Facebook is now capable of flagging 52 percent of all content it gets rid of in Myanmar before it is reported by users. 2018-12-10 11:46:48+00:00 Read the full story.

Trend-Setting Products in Data and Information Management for 2019
You can call it the new oil, or even the new electricity, but however it is described, it's clear that data is now recognized as an essential fuel flowing through organizations and enabling never before seen opportunities. However, data cannot simply be collected; it must be handled with care in order to fulfill the promise of faster, smarter decision making. More than ever, it is critical to have the right tools for the job. Leading IT vendors are coming forward to help customers address the data-driven possibilities by improving self-service access, real-time insights, governance and security, collaboration, high availability, and more. To help showcase these innovative products and services each year, Database Trends and Applications magazine looks for offerings that promise to help organizations derive greater benefit from their data, make decisions faster, and work smarter and more securely. 2018-12-05 00:00:00 Read the full story.
Immuta Accelerates Enterprise Cloud Data Science Adoption, Reduces Risk
"Immuta, the leading provider of enterprise data management solutions for artificial intelligence (AI), today unveiled new features that can dramatically reduce the cost and risk of running data science programs in the cloud. The company also announced the creation of a new business unit dedicated to building managed cloud services for its customers. According to 451 Research's Voice of the Enterprise AI/ML survey, over 50% of all organizations developing or deploying machine learning software highlighted public cloud as the favored development environment. However, as organizations move data science programs to the cloud, they often struggle to comply with stringent data privacy requirements, such as the EU General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA), and therefore aren't able to fully capitalize on the benefits of cloud economics. Also, enterprises must create multiple copies of the data per policy, which in turn requires the creation of a separate Amazon Elastic MapReduce (EMR) cluster per user permission role – which dramatically increases cost and operational complexity." 2018-12-05 00:15:20-08:00 Read the full story.

Insurers identifying AI applications "across the value chain"
The insurance market is fully embracing artificial intelligence (AI) in a bid to drive cost efficiencies, according to Craig Beattie, senior analyst at Celent, the research and advisory firm. There are three key factors leading insurers to utilise AI, said Beattie at FinTech Connect: "An increase in processing power, an increase in data availability – not just in surfacing data within the insurer, but also in absorbing external data – and also algorithm improvements. These days these things are much more efficient and we're able to rely on much better hardware as well," he said. 2018-12-05 00:00:00 Read the full story.

No Time Like the Present for AI: The Journey to a Successful Deployment
Today, over a decade since MapReduce technologies reshaped the field of data science, we find ourselves seeking more insight from even more data, ever more quickly. The confluence of improvements in computer processing power, networking performance, and storage platforms has enabled a massive leap forward in the next phase of data insight: machine learning (ML), deep learning (DL) and artificial intelligence (AI). While ML can improve the breadth, depth, and speed of insight that one can extract from one's data, making the decision to adopt ML techniques is just the first step on a path that is arguably still being paved. At its core, AI is about data, so you can't realistically begin an AI deployment journey unless you have well-curated data sets to feed the system. In most cases, data for AI is often derived from years of accumulation or from massive amounts of newly created ephemeral data. In other cases, massive data sets are being created and analyzed in real time. Regardless of source, all must be curated, as this is a critical precursor to successfully leveraging AI/ML/DL technologies. It's a daunting task, sure, but the old adage 'junk in, junk out' is true, particularly with AI. The value of what you get out of this level of analysis is directly correlated to the quality of the input data. The slightest corruption of input data can be amplified through an AI system in ways that you cannot correct later, so getting it right from the beginning is critical.
2018-12-04 00:00:00 Read the full story.

Gartner Identifies Top 10 Trends for Infrastructure, Operations in 2019
As is its convention each year at this time, IT industry researcher and analyst Gartner has highlighted the key technologies and trends for which IT infrastructure and operations decision-makers must prepare in order to fully support new-gen digital infrastructure in 2019. Gartner analysts on Dec. 4 presented these findings during the Gartner IT Infrastructure, Operations and Cloud Strategies Conference, which continues through Dec. 6 in Las Vegas. "The focus of I&O leaders is no longer to solely deliver engineering and operations, but instead to deliver products and services that support and enable an organization's business strategy," Gartner Senior Director and Analyst Ross Winser said. "The question is already becoming: 'How can we use capabilities like artificial intelligence (AI), network automation or edge computing to support rapidly growing infrastructures and accomplish business needs?'" During his presentation, Winser encouraged I&O leaders to prepare for the impacts of 10 key technologies and trends to support digital infrastructure in 2019. 2018-12-04 00:00:00 Read the full story.

5 Fundamental Theorems Of Machine Learning
The last century has seen tremendous innovation in the field of mathematics. New theories have been postulated and traditional theorems have been made robust by persistent mathematicians. And we are still reaping the benefits of their exhaustive endeavours to build intelligent machines. Here is a list of five theorems which act as a cornerstone for standard machine learning models (a short numerical illustration of one of them follows the list):
  • The Gauss-Markov Theorem
  • Universal Approximation theorem
  • Singular Value Decomposition
  • Mercer’s Theorem
  • Representer Theorem
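Since the list itself stays abstract, here is a minimal NumPy sketch of one entry, Singular Value Decomposition: any real matrix factors as A = U diag(s) Vt, and truncating the smallest singular values gives the best low-rank approximation (the Eckart-Young result). The example matrix is arbitrary and my own, not from the article.

```python
# Singular Value Decomposition on a small, arbitrary matrix.
import numpy as np

A = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.allclose(A, U @ np.diag(s) @ Vt))   # True: exact reconstruction

# Best rank-1 approximation: keep only the largest singular value.
A1 = s[0] * np.outer(U[:, 0], Vt[0])
print(np.linalg.norm(A - A1, ord="fro"))     # equals the dropped value s[1]
```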
2018-12-04 07:31:18+00:00 Read the full story.

Loads of Data and AI Announcements at Ignite 2018
The 2018 Microsoft Ignite conference was overflowing with attendees this year, as user enthusiasm continues to grow under CEO Satya Nadella. Since space is short and the announcements are many, let's get straight to the details of the product innovations available for data professionals. 2018-12-04 00:00:00 Read the full story.

Cloudera Gives a Peek at Future ML Platform
Cloudera continued its evolution away from Hadoop today by announcing a technical preview for Cloudera Machine Learning, its new data science and data engineering platform that's based on Kubernetes, which enables it to run in the cloud and on premises. Hadoop's influence has waned, in many respects, in direct proportion to the rise of public cloud platforms. Instead of taking the time to build and manage Hadoop clusters to store big data and run analytics on them, companies are turning to cloud providers like Amazon Web Services, which can offer cheap object storage and scalable compute resources. This changing market dynamic has helped to drive 46% year-over-year revenue growth for AWS, and Microsoft Azure and Google Cloud Platform are growing even faster. In many ways, the October merger of Cloudera and Hortonworks was a response to this dynamic. But Cloudera thinks it has an advantage that the cloud vendors can't touch: the capability to support multi-cloud and on-premise computing in a hybrid manner. That's the background behind today's announcement of Cloudera Machine Learning, which will combine data engineering and data science capabilities in a cloud-friendly package. As the company points out, the new offering will work "on any data, anywhere." 2018-12-05 00:00:00 Read the full story.

Why Companies That Wait to Adopt AI May Never Catch Up
While some companies — most large banks, Ford and GM, Pfizer, and virtually all tech firms — are aggressively adopting artificial intelligence, many are not. Instead they are waiting for the technology to mature and for expertise in AI to become more widely available. They are planning to be "fast followers" — a strategy that has worked with most information technologies. We think this is a bad idea. It's true that some technologies need further development, but some (like traditional machine learning) are quite mature and have been available in some form for decades. Even more recent technologies like deep learning are based on research that took place in the 1980s. New research is being conducted all the time, but the mathematical and statistical foundations of current AI are well established. 2018-12-06 13:05:30+00:00 Read the full story.

6 Step Plan to Starting Your Data Science Career
When people want to launch data science careers but haven't made the first move, they're in a scenario that's understandably daunting and full of uncertainty. However, when they follow mapped-out processes that help them get into the field, success becomes easier to visualize and achieve. Here are six steps to get started… 2018-12-06 00:00:00 Read the full story.

A.I. as Talent Scout: Unorthodox Hires, and Maybe Lower Pay
One day this fall, Ashutosh Garg, the chief executive of a recruiting service called Eightfold.ai, turned up a résumé that piqued his interest. It belonged to a prospective data scientist, someone who unearths patterns in data to help businesses make decisions, like how to target ads. But curiously, the résumé featured the term "data science" nowhere.
Instead, the résumé belonged to an analyst at Barclays who had done graduate work in physics at the University of California, Los Angeles. Though his profile on the social network LinkedIn indicated that he had never worked as a data scientist, Eightfold's software flagged him as a good fit. He was similar in certain key ways, like his math and computer chops, to four actual data scientists whom Mr. Garg had instructed the software to consider as a model. The idea is not to focus on job titles, but on "what skills they have," Mr. Garg said. "You're really looking for people who have not done it, but can do it." 2018-12-06 00:00:00 Read the full story.

Wealthfront Unleashes Free Financial Planning via App
The service uses artificial intelligence techniques like machine learning to develop its advice. Wealthfront, the second largest independent robo-advisor, has launched a free financial planning service available to anyone who downloads the Wealthfront app. The app uses the same underlying financial advice engine available to current Wealthfront clients, called Path, which can aggregate data from all financial accounts belonging to an individual, then analyze the data to advise on goals such as saving for a home, college or retirement. "We realized we needed to get our financial planning service into more hands after we found engagement with Path is directly correlated to clients saving more of their income," said Dan Carroll, co-founder of Wealthfront, in a statement. "Wealthfront clients who use Path are saving four times more than the average American every year." 2018-12-04 00:00:00 Read the full story.

How does predictive pricing change retail pricing optimization?
When was predictive pricing born? Reportedly, at Japanese rice exchanges in the 17th century, which used rice stock analysis to describe the behavior of the stock market and make predictions about future trends and prices. Today, four hundred years later, retailers leverage the power of algorithms to help them set optimal prices and earn more. What major changes have machines brought to retail? There was something in between the methods the Japanese used to predict prices and modern machine-based solutions that make pricing recommendations: the first machine learning algorithm, invented by Sir Francis Galton, an English statistician, in the 19th century. Called linear regression, the algorithm could establish a one-way dependency of one variable on another and predict one of them from the other. Since then, humans have been constantly upgrading their computational capacity: they have switched from an abacus to the simplest computing devices, and then to more sophisticated machines. Simultaneously, they have been accumulating large data sets: the data then adopted a structure and relationships between data points. The more sophisticated data has called for more complicated algorithms. 2018-12-04 13:11:34+00:00 Read the full story.

Do Reinforcement Learning (RL) Algorithms Need New Simulators To Solve Real-World Problems?
The current benchmarks used for reinforcement learning tasks are in many ways underperforming in absorbing real-world data. The testing of reinforcement learning algorithms in low-complexity simulation environments ends up with algorithms that are very specific. The families of benchmark RL domains should contain some of the complexity of the natural world, while still supporting fast and extensive data acquisition.
The domains should also permit a test of generalisation through fair train/test separation, easy comparison and replication of results. Results showing poor adaptability of RL algorithms to a problem suggest that either the algorithm itself is inefficient or the simulators are not diverse enough to induce interesting learned behaviours, and very few researchers concentrate on the latter. What we need is a new class of reinforcement learning simulators that incorporate signals or data from the real world as part of the state space. 2018-12-07 10:52:59+00:00 Read the full story.

Deep Transfer Learning for Natural Language Processing — Text Classification with Universal…
Transfer learning is an exciting concept where we try to leverage prior knowledge from one domain and task in a different domain and task. The inspiration comes from us humans ourselves, where we have an inherent ability to not learn everything from scratch. We transfer and leverage our knowledge from what we have learnt in the past when tackling a wide variety of tasks. With computer vision, we have excellent big datasets available to us, like ImageNet, on which we get a suite of world-class, state-of-the-art pre-trained models to leverage for transfer learning. But what about natural language processing? Therein lies an inherent challenge, considering text data is so diverse, noisy and unstructured. We've had some recent successes with word embeddings, including methods like Word2Vec, GloVe and FastText, all of which I have covered in my article on 'Feature Engineering for Text Data'. In this article, we will be showcasing several state-of-the-art generic sentence embedding encoders, which tend to give surprisingly good performance on small amounts of data for transfer learning tasks as compared to word embedding models. We will be covering the following models (a minimal sketch of the baseline approach follows the list):
  • Baseline Averaged Sentence Embeddings
  • Doc2Vec
  • Neural-Net Language Models (Hands-on Demo!)
  • Skip-Thought Vectors
  • Quick-Thought Vectors
  • InferSent
  • Universal Sentence Encoder
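Before the hands-on demos, here is a minimal sketch of the first item on the list, Baseline Averaged Sentence Embeddings: represent a sentence as the mean of its word vectors, yielding a fixed-length vector for any downstream classifier. The tiny 4-dimensional "embeddings" below are invented for illustration; a real pipeline would load pretrained vectors such as GloVe or FastText.

```python
# Baseline averaged sentence embeddings with invented toy word vectors.
import numpy as np

word_vectors = {
    "the":   np.array([0.1, 0.0, 0.2, 0.1]),
    "movie": np.array([0.5, 0.3, 0.1, 0.0]),
    "was":   np.array([0.0, 0.1, 0.0, 0.2]),
    "great": np.array([0.9, 0.8, 0.2, 0.1]),
    "awful": np.array([-0.7, -0.9, 0.1, 0.3]),
}

def sentence_embedding(sentence: str) -> np.ndarray:
    """Average the vectors of known words; zeros if none are known."""
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(4)

a = sentence_embedding("the movie was great")
b = sentence_embedding("the movie was awful")
# Cosine similarity between the two fixed-length sentence vectors.
print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```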
We will try to cover essential concepts and also showcase some hands-on examples leveraging Python and TensorFlow, in a text classification problem focused on sentiment analysis! 2018-12-04 16:56:58+00:00 Read the full story.

ACI Worldwide teams with BioCatch on behavioural biometrics
ACI Worldwide (NASDAQ: ACIW), a leading global provider of real-time electronic payment and banking solutions, today announced a collaboration with BioCatch, the global leader in behavioural biometrics, to protect customers from online and mobile banking fraud such as account takeover. In addition to machine learning and advanced analytics capabilities from ACI's UP Payments Risk Management solution, banks around the world will now benefit from BioCatch's real-time behavioural assessments to identify a wide range of cyber threats without disrupting the user experience. 2018-12-10 13:52:00 Read the full story.

Infosys Research: Data Analytics Most Instrumental in Lowering Risk for Organizations
"Infosys, a global leader in next-generation digital services and consulting, today published global research on data analytics from the Infosys Knowledge Institute. The survey, titled 'Endless possibilities with data: Navigate from now to your next', reveals that a majority of organizations are deploying analytics to enhance customer experiences and mitigate risk. This research tries to understand how data analytics is becoming core to driving digital transformation for enterprises and makes an assessment of enterprise expectations in a world of endless possibilities with data. It also explores a range of challenges, opportunities, and the role of new technologies in the analytics world." 2018-12-06 00:10:06-08:00 Read the full story.

Creating a National Data Privacy Law
The United States lacks a cohesive data privacy law. Presently, U.S. law is a combination of federal sectoral laws and state laws. With the growing interest from consumers, tech companies, media, and politicians, there may finally be enough momentum to pass a federal law. This article will discuss the pros and cons of a national data privacy law, the challenges to creating a national privacy law, and how it would greatly benefit businesses. 2018-12-06 00:00:00 Read the full story.

Data tends to lie a lot. But we can circumvent this
Consumer behaviour cannot be so easily predicted from social media reactions like Facebook Likes and Twitter shares. "Let me give you an example. Let's say I post something online about a Ferrari and you 'Like' it. Are you liking Ferrari or are you liking your post? There's no real way to know. There's a lot of noise out there. There's a lot of data available, but it's not powerful in prediction," said Suresh Shankar, founder of Crayon Data, in an interview given to Kristie Neo of Deal Street Asia. Founded in 2012, the Singapore-based Crayon Data has an AI algorithm which analyses consumer data to acquire customers for clients like Emirates Airlines, regional banks, and hotels. Crayon claims its algorithm is smart enough even to predict offline behaviour, a crucial aspect for Southeast Asia where a majority of transactions still take place off the grid. 2018-12-07 08:23:07+00:00 Read the full story.
Discover The Power of Big Data And Learning Analytics For Education
Big data and learning analytics have immense power to transform our world, but what does that power really mean on a day-to-day basis? Let's take a look. Data-driven decision making is all about making decisions that are backed up by solid, verifiable data rather than observations. Strategic decision making, which was popularized in the 1980s and 1990s, has transformed into a sophisticated concept that bears the name of big data. With the help of advanced analytic techniques, it is possible to take a close look at large and varied data sets so as to uncover hidden patterns or correlations. At present, big data and analytics are used for instructional applications in the context of higher education. Because there is not sufficient evidence regarding how the investment can pay off, big data and learning analytics have not yet been widely adopted. Collecting and analyzing large chunks of data is one of the best approaches to improving the learning process. The problem is that educational sectors do not take advantage of the opportunities to invest in big data analytics and language processing. Basically, they do not do what it takes to improve their competitiveness and productivity. 2018-12-06 21:06:24+00:00 Read the full story.

How Farmer's Fridge is revolutionizing breakfast, lunch and dinner with tech and data
Five years ago, Farmer's Fridge took the concept of the vending machine and turned it on its head — instead of potato chips and chocolate bars, healthy, seasonal and fresh food would be available to customers on demand. Undoubtedly, tech has played a starring role in getting operable fridges at more than 200 locations across the Midwest, as the internet of things connecting the company, their fridges and consumers waiting to grab breakfast, lunch or dinner continues to become more efficient each day. We spoke with three of the company's engineers on how they're improving the internet of things as Farmer's Fridge scales. 2018-12-06 00:00:00 Read the full story.

TigerGraph Unveils Latest Cloud Platform with Starter Kits for AI and Machine Learning
TigerGraph, a graph analytics platform for the enterprise, is introducing TigerGraph Cloud, a robust way to run scalable graph analytics in the cloud. Users can get their TigerGraph service up and running, tapping into TigerGraph's library of customizable graph algorithms to support key use cases including AI and machine learning. "Not only are we providing this as a service but we are providing a starter kit that a user can go in, select a particular deployment pattern, and use some of the prebuilt algorithms so they can do deeper analytics on their data much more quickly," said Todd Blaschka, COO, TigerGraph. TigerGraph Cloud provides data scientists, business analysts, and developers with a cloud-based service for applying SQL-like queries for faster and deeper insights into data. 2018-12-04 00:00:00 Read the full story.

Google Uses Machine Learning To Combat Low Quality Link Spam
Google has implemented machine learning to help combat low-quality link spam, and there are a lot of lessons that can be learned from Google's methods. Google is one of the companies that has made a name for itself by using big data. The company was founded on the PageRank algorithm, which collected data on all of the links that websites had received across the Internet and incorporated that into its ranking algorithm.
Although the algorithm was a huge improvement over previous search engines, it was still an inadequate solution. Google needed to use more sophisticated big data technology to improve its SERPs and avoid low-quality link spam. Machine learning has played an important role in rectifying the challenges that Google faced as it strived to bring the highest quality search results to its users. It used TensorFlow, an open source machine learning framework, to help rank webpages for various search phrases. Here are some benefits of machine learning approaches… 2018-12-05 17:21:29+00:00 Read the full story.

Intel & Brazilian Robotics Company Hoobox Build World's First AI Wheelchair
In an attempt to leverage artificial intelligence for good, Brazilian robotics company Hoobox Robotics released a Wheelie 7 kit powered by Intel. The Wheelie 7 allows motorized-wheelchair users greater mobility by letting them control the chair with simple facial expressions. The AI-powered wheelchair uses AI and a camera, without invasive body sensors, providing users with independence and control over their location. As per the company statement, there are more than 60 people in the US testing the Wheelie 7 – most of whom are quadriplegics, people with amyotrophic lateral sclerosis, or senior citizens. Anna Bethke, leader of AI for Social Good at Intel, remarked, "Today on International Day of Persons with Disabilities, it's important to recognize the ways technology can help people regain mobility and control of their lives. The Wheelie 7 kit from HOOBOX Robotics is a great example of using AI to enable people with limited mobility to move around using natural facial movements." "The Wheelie 7 is the first product to use facial expressions to control a wheelchair. This requires incredible precision and accuracy, and it would not be possible without Intel technology," said Dr. Paulo Pinheiro, co-founder and CEO of HOOBOX Robotics. 2018-12-05 13:04:23+00:00 Read the full story.

5 Important AI Predictions For 2019 Everyone Should Read
Artificial intelligence – specifically machine learning and deep learning – was everywhere in 2018, and don't expect the hype to die down over the next 12 months. The hype will die eventually of course, and AI will become another consistent thread in the tapestry of our lives, just like the internet, electricity, and combustion did in days of yore. But for at least the next year, and probably longer, expect astonishing breakthroughs as well as continued excitement and hyperbole from commentators. 2018-12-07 15:30:24+00:00 Read the full story.

Startup Proven Beauty says A.I. is key to having flawless skin
The U.S. facial skincare market is worth an estimated $7 billion and growing rapidly, according to research firm Mintel. Yet despite the wealth of products and information available promising flawless, ageless skin, consumers are frustrated by the low success rate. Start-up Proven Skincare believes that personalized products have a much higher efficacy level. To determine the best ingredients for each individual, Proven is relying on AI, aggregating data from 8 million consumer reviews, 100,000 skincare products, 20,000 ingredients and 4,000 academic journals. Every year, consumers spend billions on skincare products that promise to reduce lines instantly, fade brown spots, improve firmness and elasticity and more.
Intel & Brazilian Robotics Company Hoobox Build World's First AI Wheelchair In an attempt to leverage artificial intelligence for good, Brazilian robotics company Hoobox Robotics released the Wheelie 7 kit, powered by Intel. The Wheelie 7 gives users of motorized wheelchairs greater mobility by letting them control the chair with simple facial expressions. The wheelchair uses AI and a camera, without invasive body sensors, providing users with independence and control over where they go. According to the company, more than 60 people in the US are testing the Wheelie 7 – most of them quadriplegics, people with amyotrophic lateral sclerosis, or senior citizens. Anna Bethke, leader of AI for Social Good at Intel, remarked, "Today on International Day of Persons with Disabilities, it's important to recognize the ways technology can help people regain mobility and control of their lives. The Wheelie 7 kit from HOOBOX Robotics is a great example of using AI to enable people with limited mobility to move around using natural facial movements." "The Wheelie 7 is the first product to use facial expressions to control a wheelchair. This requires incredible precision and accuracy, and it would not be possible without Intel technology," said Dr. Paulo Pinheiro, co-founder and CEO of HOOBOX Robotics. 2018-12-05 13:04:23+00:00 Read the full story.

5 Important AI Predictions For 2019 Everyone Should Read Artificial Intelligence – specifically machine learning and deep learning – was everywhere in 2018, and don't expect the hype to die down over the next 12 months. The hype will die eventually, of course, and AI will become another consistent thread in the tapestry of our lives, just like the internet, electricity, and combustion did in days of yore. But for at least the next year, and probably longer, expect astonishing breakthroughs as well as continued excitement and hyperbole from commentators. 2018-12-07 15:30:24+00:00 Read the full story.

Startup Proven Beauty says A.I. is key to having flawless skin The U.S. facial skincare market is worth an estimated $7 billion and growing rapidly, according to research firm Mintel. Yet despite the wealth of products and information available promising flawless, ageless skin, consumers are frustrated by the low success rate. Start-up Proven Skincare believes that personalized products have a much higher efficacy level. To determine the best ingredients for each individual, Proven is relying on AI, aggregating data from 8 million consumer reviews, 100,000 skincare products, 20,000 ingredients and 4,000 academic journals. Every year, consumers spend billions on skincare products that promise to reduce lines instantly, fade brown spots, improve firmness and elasticity and more. Yet despite the wealth of products available today, consumers continue to be frustrated by manufacturers' inflated claims of smooth, silky, younger-looking skin. Now beauty start-up Proven Skincare — launched in November by Harvard Business School grad Ming Zhao and computational physicist Amy Yuan — is aiming to curb all that frustration, rejecting the traditional one-size-fits-all miracle remedy and instead relying on artificial intelligence to develop data-driven skincare routines that are completely personalized and sent straight to your doorstep. 2018-12-07 00:00:00 Read the full story.

The Top 5 Game Changing Uses For The Internet of Things These five fantastic uses for the internet of things can change the game in individual businesses and the way we live our daily lives. Not long ago, people were scratching their heads and asking, "What is the internet of things and why should I care?" They're still scratching their heads, but it's not for lack of proliferation. The IoT is all over the place, and there are all kinds of uses for the internet of things. There are about 7 billion active IoT devices right now, and that number will only grow. People now know what the IoT is, and they're wondering if they really need a connected, smart toothbrush, washing machine, refrigerator, or thermostat. They're also wondering about security as billions of disparate devices flood the market. Despite the inherent skepticism that accompanies any new technological advancement, there really are some great uses for the IoT. Read on to find out what they are. 2018-12-06 09:30:34+00:00 Read the full story.

AI Charts Its Course in Aviation, Moving More Into Cockpit Ask anyone what they think of when the words "artificial intelligence" and aviation are combined, and it's likely the first things they'll mention are drones. But autonomous aircraft are only a fraction of the impact that advances in machine learning and other artificial intelligence (AI) technologies will have in aviation—the technologies' reach could encompass nearly every aspect of the industry. Aircraft manufacturers and airlines are investing significant resources in AI technologies in applications that span from the flightdeck to the customer's experience. Automated systems have been part of commercial aviation for years. Thanks to the adoption of "fly-by-wire" controls and automated flight systems, machine learning and AI technology are moving into a crew-member role in the cockpit. Rather than simply reducing the workload on pilots, these systems are on the verge of becoming what amounts to another co-pilot. 2018-12-07 15:10:18+00:00 Read the full story.

Before Starting, Consider 5 Reasons Your Big Data Project Will Fail There are all kinds of reasons your big data project will fail, but instead of finding that discouraging, use these lessons to fuel your success. Whether companies are finding revolutionary ways to use data for personalized recommendations, like Netflix, or getting in trouble for illegally targeting users, like Facebook, big data is an integral element in optimizing efficiency. However, if your company is ready to give its customers a better user experience while shopping for the best homeowners insurance, or if you're trying to save money on customer clothing returns, diving into a big data project isn't going to be your saving grace. In fact, according to Nick Heudecker, a Gartner analyst, 85 percent of all big data projects fail.
These failures come from a lack of understanding of how to run a big data project, and in an interview with TechRepublic, Heudecker cites company adoption as the sore spot, not the technology itself. If your company is itching to use data in its business practice and wants the project to succeed, there are simple steps to help you stay out of the 85 percent statistic. We're detailing the 5 mistakes you are likely to make during your big data project, in the hope that you can avoid these problems before they arise. 2018-12-06 10:30:34+00:00 Read the full story.

Top 9 Digital Consumer Trends for Financial Marketers in 2019 Forecasting consumer trends is an inexact science at best. But financial marketers must stay ahead of major trends that impact purchase decisions. Such evolutions can't be ignored – especially at a time of increasingly rapid change. These digital consumer trends are already taking root and will grow in importance for marketers and advertisers in 2019. The amount of information consumers can access is incredible, but the amount of data each human produces is even more startling. Over the last two years alone, 90% of the data in the world was generated. This data is created through searches, social media interactions, communication (email, texts, etc.), digital photos, digital services (Weather Channel, Uber, etc.), and the Internet of Things (voice devices, etc.). By 2020, it's estimated that for every person on earth, 1.7 MB of data will be created… every second. Consumers can access information within seconds on an increasing array of devices. As a result, consumer trends are being influenced by people's ever-growing relationship with their smartphones and the internet, and the impact these devices and data have on their lives. Consumers are aware of this data overload and the pace of change that has occurred. In response, consumers are looking for ways to sort and consume the inputs (and outputs) that are most likely to impact their daily lives. 2018-12-10 05:05:36+00:00 Read the full story.

How this UW grad student, researching quantum computing, proved that classical computers are better than we thought A lot of great discoveries were made while looking for something else. For University of Washington computer science grad student Ewin Tang, research into quantum computing showed that our regular old computers might be capable of much more than we once thought. Tang's discovery of a powerful new machine-learning algorithm for classical computers upended assumptions about computing challenges that were thought to require quantum computers. That discovery, made while Tang was studying machine-learning algorithms and quantum computing as an undergraduate at the University of Texas, has enormous implications for both of those fields. Now enrolled in the UW's Paul G. Allen School of Computer Science & Engineering as a graduate student at the age of just 18, Tang is continuing to research how quantum computing will impact machine learning. Just last week, two other papers were released showing that her breakthrough result works with other types of machine learning. "We ended up getting this result in quantum machine learning, and as a nice side effect a classical algorithm popped out," Tang said in an interview with GeekWire. 2018-12-05 18:00:18-08:00 Read the full story.
2019 Predictions: How We Learned to Stop Worrying and Love AI I recently got the chance to share a couple of predictions about what 2019 will bring to the world of Artificial Intelligence via Forbes. Thought I'd expand on them here. As someone who has been fascinated with AI and machine learning since the earliest days of my career, the rapid pace of progress in the field has been astounding. Almost overnight we went from an era of struggling to do simple image recognition tasks (think: "hot dog, not hot dog") to AIs powered by deep neural networks capable of understanding and describing complex scenes with more accuracy and speed than humans. The pace of innovation is only accelerating. The tools are getting more powerful, cheaper, and more accessible. I spent the first 5 years of my career working on machine learning projects, and I often joke that those 5 years could today be replaced with 5 lines of Python package imports. Which is why my first prediction is that the job title "machine learning engineer" will start to disappear. You don't need a fancy degree or specialization to harness AI nowadays; these tools are becoming part of the standard developer toolbox. 2018-12-10 02:09:40.678000+00:00 Read the full story.

Hex – Creating Intelligent Adversaries (Part 2: Heuristics & Dijkstra's Algorithm) Demystifying AI Game Opponents In today's article, we are going to dive deeper into the creation of an intelligent opponent in the game of Hex. In Part 1 of the Hex series, we covered the α-β Pruned Minimax algorithm, which we used to find optimal moves. However, in order to make use of the Minimax algori… 2018-12-10 00:02:25.483000+00:00 Read the full story.
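For readers who missed Part 1, here is a minimal, self-contained sketch of the α-β pruned minimax idea the series builds on. It is a generic illustration over a toy take-away game, not the article's own Hex code; in Hex, the leaf evaluation would instead come from a Dijkstra-style board-distance heuristic.

```python
# Alpha-beta pruned minimax over a toy game: players alternately take
# 1-3 stones from a pile; whoever takes the last stone wins.
import math

def moves(pile):
    # Legal moves: take 1, 2 or 3 stones, never more than remain.
    return [m for m in (1, 2, 3) if m <= pile]

def alphabeta(pile, maximizing, alpha=-math.inf, beta=math.inf):
    if pile == 0:
        # The player who just moved took the last stone and won.
        return -1 if maximizing else 1
    best = -math.inf if maximizing else math.inf
    for m in moves(pile):
        score = alphabeta(pile - m, not maximizing, alpha, beta)
        if maximizing:
            best = max(best, score)
            alpha = max(alpha, best)
        else:
            best = min(best, score)
            beta = min(beta, best)
        if alpha >= beta:
            break  # prune: the opponent will never allow this branch
    return best

# Piles divisible by 4 are losses for the player to move.
print(alphabeta(10, True))   # -> 1  (win)
print(alphabeta(12, True))   # -> -1 (loss)
```

The cutoff check (`alpha >= beta`) is the whole trick: once a branch is provably worse than one already available, the rest of its subtree is skipped.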
Don't believe the hype: There's nothing wrong with the space station robot CIMON, an AI-powered robot developed by IBM and Airbus, recently acted perfectly normal during interactions with the human crew aboard the International Space Station (ISS). A slew of journalists don't agree with that assessment, but we'll let you decide. When a crew member tried to ask it to do stuff, it got confused, misinterpreted certain voice commands, and generally failed to produce the expected results with any consistency. Yep, sounds like business as usual. If you own a smart speaker, interact with a virtual assistant, or have ever played Zork (okay, maybe not that one) you know exactly how it feels to interact with an ordinary chatbot – and there's not exactly a chasm between IBM's Watson and Amazon's Alexa. We love our AI-powered gadgets when they work, and they're getting better all the time, but when they don't they can be frustrating. In Watson's defense, it's among the first AI solutions to be tested in space, and certainly the first floating chatbot on the ISS. 2018-12-06 00:00:00 Read the full story.

Liqid, Intel Optane Take Aim at DRAM with New Fabric Package New-gen IT infrastructure provider Liqid and old-school chipmaker Intel have combined forces to produce a new, solid state-based storage memory fabric in an effort to dislodge conventional DRAM (dynamic random-access memory) and bring better efficiencies to data centers. At the Gartner IT Infrastructure, Operations and Cloud Strategies Conference 2018 on Dec. 5, Liqid, which makes trendy composable disaggregated infrastructure (CDI) and NVMe (non-volatile memory express) solutions, demonstrated solutions that combine the performance of Intel Optane SSDs with Liqid's new-gen PCIe fabrics to offer projected new levels of performance. 2018-12-05 00:00:00 Read the full story.

How Oracle Is Convincing Institutional Users to Move to Cloud SAN FRANCISCO—Oracle CEO Mark Hurd is tasked with many things, but persuading a lot of old-line institutional-IT customers to move as much of their IT infrastructure as possible into a cloud or clouds is one of the biggest and most challenging ones. We're talking about huge, well-entrenched and wide-ranging systems in data centers run by government agencies, the military, multinational corporations, aircraft makers, oil and gas developers, scientific labs and others that are probably looking to retool their vast IT systems anyway. So, Oracle contends, why not move it all into distant, professionally run data centers, and trust companies like itself and its partners with their crown jewels—which, of course, is all that data? Much easier said than done. 2018-12-04 00:00:00 Read the full story.

Is Predictive Analytics Solving Challenges In Content Creation? Big data has played a fundamental role in the evolution of content marketing. The Editorial Staff at Business.com states that big data has helped by:
  • Providing deeper insights into customer mindsets by tapping data from social networks, shopping activities and other third-party data resources
  • Analyzing the best times to distribute content for ROI
  • Tracking various factors that influence marketing campaigns with data-driven tracking tools
Some of the most recent advances in big data have been significant, and predictive analytics has been especially valuable. But just how significant is predictive analytics in content marketing? Martech Advisors reports that companies using predictive analytics in their campaigns are able to get more traction with 77% less content. 2018-12-05 17:49:43+00:00 Read the full story.

Machine Learning is Moving Corporate VPN Security into The 21st Century Machine learning has played a crucial role in digital security, and according to experts it is now playing a vital role in the creation of new VPN solutions and corporate VPN security. The Journal of Cyber Security Technology published a study in 2017 titled "Comparison of machine-learning algorithms for classification of VPN network traffic flow using time-related features." It provided a clear overview of the machine learning algorithms used to classify VPN network traffic and the value they provide, showing that machine learning helped achieve roughly 90% classification accuracy for most VPN traffic. 2018-12-06 21:13:32+00:00 Read the full story.
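The cited study trains classifiers on time-related flow features. As a rough, hedged illustration of that style of experiment (synthetic data, hypothetical feature names, not the study's actual code or dataset):

```python
# Illustrative sketch: classify flows as VPN vs non-VPN from time-based
# features (duration, mean inter-arrival time, idle time, active time).
# The data and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((2000, 4))                                  # fake flow features
y = (X[:, 1] + 0.2 * rng.standard_normal(2000) > 0.5).astype(int)  # toy labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```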
Artificial Intelligence for Data Analytics and the Customer Experience Industries are looking for more and more AI-based solutions to transform customer experience. AI has become the buzzword for reducing customer effort, making processes tech-driven and assuring the best service. One more aspect that AI and machine learning cover is a proactive approach and a personalized experience for customers. Companies are trying to develop solutions that track or predict the needs of their customers, enabling organizations to serve customers with such solutions beforehand. Every customer is valuable, so personalized service helps brands stand out in the eyes of consumers and end users. The world is changing, and so are customer journeys and experiences. Technology is one of the major enablers for exceeding customer expectations and an effective way to understand changing customer trends. 2018-12-06 00:30:11-08:00 Read the full story.

'Are you there, God? It's me, Alexa.' Tech and religious leaders ponder the future of AI together Can a computer become God? Or, more to the point, could humans invent AI that can convincingly impersonate God — and if so, would humans bother to worship it? That was one of the questions explored Saturday by technology experts, the faithful and everyone in between at a conference devoted to artificial intelligence and faith at Seattle Pacific University. The concept of "AI Almighty" might not be as outlandish as it seems. Last year, a former Uber engineer founded a nonprofit religious organization called The Way of the Future. Its mission: creating an AI deity. New York Magazine writer Andrew Sullivan wrote Friday that America's religious inclinations are succumbing to other devotions. 2018-12-09 21:03:19-08:00 Read the full story.

Inspecting bridges is hard and dangerous. Send in the drones If you commute over bridges and overpasses each day, there's a fairly good chance that some of them might need repair (more than 54,000 U.S. bridges are rated "structurally deficient"). As the country's infrastructure ages, bridge inspections get even more important, but the work is expensive, time-consuming, and dangerous, requiring engineers to rappel down the side of a bridge hundreds of feet above a river. It's a task that might be better suited for drones. In two recent bridge inspections–one at the Stone Arch Bridge in Minneapolis, and the other at the Daniel Carter Beard Bridge on the Ohio-Kentucky border–Intel partnered with transportation officials to use drones to capture detailed high-res images of each structure. "It's collecting a series of images, and the second part is actually stitching these images together to recreate what you call a digital twin," says Anil Nanduri, general manager of Intel's drone group. "Imagine you have a bridge over a river, but then you have an exact replica of it–you can zoom into the finest details, including a sub-centimeter level of detail, on your computer screen." Current inspections are often done with pen and paper, so the data is difficult to share. A drone's data is easier for a group of people to analyze and can be tracked over time. As more and more data is collected, AI and machine learning can begin to automatically highlight cracks, corrosion, or other defects. 2018-12-05 08:00:57 Read the full story.

Barefoot Expands Reach With Tofino 2 Network Chips For the past couple of years, Barefoot Networks has been driving programmability into networks through its Tofino Ethernet ASIC and the P4 programming language, to address the rising demand for more bandwidth and for features that support new workloads like artificial intelligence and machine learning. ASICs have for more than two decades been fixed-function chips, wired to do only one function, according to Ed Doe, vice president of product, business and strategy for Barefoot. However, the trend within hyperscale and enterprise data centers has been toward greater programmability in silicon and domain-specific architectures. GPUs and digital signal processors have grown in importance to help systems better handle particular tasks, and now there are technologies like tensor processing units to address AI workloads. Intel and Xilinx have also driven the development of field-programmable gate arrays (FPGAs), chips that can be reprogrammed via software. What's been needed is similar programmability in networks, Doe told eWEEK. 2018-12-04 00:00:00 Read the full story.

Augmented Reality (AR) and AI Self-Driving Cars …Back when I was helping my teenagers learn to drive, Augmented Reality was still being established; it was relatively crude and a computer-cycles hog. The advent of AR on a smartphone that could update in near real-time was a sign that AR was finally becoming something that could touch the masses and not be relegated only to very expensive goggles. During my teaching moments about driving a car, I had dreamed that it would be handy to have a Heads-up Display (HUD) on the car that would overlay a virtual world on the real one, so that I could do more than just pretend in our minds that there were various obstacles in the parking lot. I would have liked the entire front windshield of the car to act like a portal that would continue to show the real world, and yet also allow an overlay of a virtual world… 2018-12-07 15:45:07+00:00 Read the full story.

Morgan Stanley says market for self-flying cars could rise to $1.5 trillion by 2040 The market for autonomous flying cars — also known as eVTOL aircraft, air taxis or personal air vehicles — could amount to nearly $1.5 trillion by the year 2040, according to an in-depth analysis from Morgan Stanley Research.
The financial company's 85-page report, distributed to clients this week, draws together data from a host of sources, including a private-public symposium on urban air mobility that was conducted last month in Seattle. "We see the development of the UAM [urban air mobility] ecosystem as extremely long-dated and requiring up-front capital allocation, testing and development in the short term, with increasing visibility," said Morgan Stanley's research team, which includes senior analyst Adam Jonas. 2018-12-08 02:33:54-08:00 Read the full story.

Customer experience: A new marketing strategy for your brand AI's most significant impact is on CX management, particularly in online shopping. A study on AI and chatbots found that when an online store uses AI, 49% of consumers would buy from it more often, and 38% would share their experiences with family and friends. Delivering personalized, interactive customer experiences is the biggest benefit of tapping AI for your marketing strategy. Some of the world's leading brands—including Google, Apple, Microsoft, Skype, and Amazon—are doing it through chatbots. Such AI-powered technology allows businesses to help customers find an item they need, score the best prices and deals, view recommendations, and more. AI can even help you serve customers before they ask for it. Starbucks, for instance, uses AI to suggest orders before customers decide on what they want and to guess their orders on their next visits. 2018-12-05 08:50:00+00:00 Read the full story.

Trello acquires Butler, a tool for automating repetitive tasks Trello, a web-based project management app launched seven years ago by Fog Creek Software and later acquired by Atlassian for $425 million, is a powerful collaboration tool. …with the help of artificial intelligence (AI), Butler learns from the past three months of Trello board activity to automatically suggest candidates for automation… 2018-12-10 00:00:00 Read the full story.

DeepMind's AlphaZero now showing human-like intuition in historical 'turning point' for AI DeepMind's artificial intelligence programme AlphaZero is now showing signs of human-like intuition and creativity, in what developers have hailed as a 'turning point' in AI history. The computer system amazed the world last year when it mastered the game of chess from scratch within just four hours, despite not being programmed how to win. But now, after a year of testing and analysis by chess grandmasters, the machine has developed a new style of play unlike anything ever seen before, suggesting the programme is now improvising like a human. 2018-12-06 00:00:00 Read the full story.

Build a Face Detection Model on a Video using Python The wonderful field of Computer Vision has soared into a league of its own in recent years. There are an impressive number of applications already in wide use around the world – and we are just getting started! One of my favorite things in this field is our community's embrace of open source. Even the big tech giants are willing to share new breakthroughs and innovations with everyone so that the techniques do not remain a "thing of the rich". One such technology is face detection, which offers a plethora of potential applications in real-world use cases (if used correctly and ethically). In this article, I will show you how to build a capable face detection algorithm using open source tools. Here is a demo to get you excited and set the stage for what will follow: 2018-12-10 11:02:03+05:30 Read the full story.
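The article's full walkthrough sits behind the link. As a hedged taste of what open source face detection on video can look like, here is a minimal sketch using OpenCV's bundled Haar cascade; the input.mp4 file name is a hypothetical placeholder, and the article itself may use a different detector.

```python
# Minimal face-detection-on-video sketch with OpenCV's Haar cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture("input.mp4")  # hypothetical input file

while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of video (or unreadable file)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Draw a green box around each detected face.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```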
How software-defined networking pulls the strings of multicloud The multicloud world has well and truly arrived. Organisations today make use of multiple public clouds such as AWS, GCE and Microsoft Azure as well as many cloud-based services (e.g. SaaS, IaaS), and they are using these in addition to on-premises workloads and applications that run in private clouds hosted in one or more data centres. However, with this transformation, networking and network security in a multicloud environment become more complex and a significant challenge. 2018-12-05 00:00:00 Read the full story.

A Practical Guide to Object Detection using the Popular YOLO Framework How easy would our life be if we simply took an already designed framework, executed it, and got the desired result? Minimum effort, maximum reward. Isn't that what we strive for in any profession? I feel incredibly lucky to be part of our machine learning community, where even the top tech behemoths embrace open source technology. Of course, it's important to understand and grasp concepts before implementing them, but it's always helpful when the groundwork has been laid for you by top industry data scientists and researchers. This is especially true for deep learning domains like computer vision. Not everyone has the computational resources to build a DL model from scratch. That's where predefined frameworks and pretrained models come in handy. And in this article, we will look at one such framework for object detection – YOLO. It's a supremely fast and accurate framework, as we'll see soon. So far in our series of posts detailing object detection (links below), we've seen the various algorithms that are used, and how we can detect objects in an image and predict bounding boxes using algorithms of the R-CNN family. We have also looked at the implementation of Faster-RCNN in Python. In part 3 here, we will learn what makes YOLO tick, why you should use it over other object detection algorithms, and the different techniques used by YOLO. Once we have understood the concept thoroughly, we will then implement it in Python. It's the ideal guide to gain invaluable knowledge and then apply it in a practical hands-on manner. 2018-12-06 07:49:52+05:30 Read the full story.
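To make the idea concrete before clicking through, here is a bare-bones sketch of running a pretrained YOLOv3 network through OpenCV's DNN module. The yolov3.cfg, yolov3.weights and street.jpg file names are hypothetical placeholders, and the article's own implementation may differ.

```python
# Bare-bones YOLOv3 inference via OpenCV's DNN module (sketch only).
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
img = cv2.imread("street.jpg")  # hypothetical test image
blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416),
                             swapRB=True, crop=False)
net.setInput(blob)
outs = net.forward(net.getUnconnectedOutLayersNames())

h, w = img.shape[:2]
for out in outs:
    for det in out:              # det: [cx, cy, bw, bh, objectness, classes...]
        scores = det[5:]
        class_id = int(np.argmax(scores))
        conf = float(scores[class_id])
        if conf > 0.5:
            # Box coordinates are normalized; scale back to image size.
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            print(f"class {class_id} conf {conf:.2f} "
                  f"box=({cx - bw/2:.0f},{cy - bh/2:.0f},{bw:.0f},{bh:.0f})")
```

A production version would also apply non-maximum suppression (e.g. cv2.dnn.NMSBoxes) to merge overlapping detections.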
At Tempus, the team is changing health care – and the world It's not often you hear someone in the workplace say they're hoping to change the world. But at Tempus, it's the norm. With a data-first mentality, Tempus hopes to improve outcomes in the health care sector by collecting and pipelining a library of clinical and molecular data, and then shepherding it through a machine learning platform to help physicians draw insights for cancer patients across the country — including at premier hospital networks like the Mayo Clinic and Cleveland Clinic. We spoke with Tempus team members from the design, product and engineering teams about how collaboration, when combined with a passion for health care, is changing patients' lives. 2018-12-07 00:00:00 Read the full story.

Regex on the Texts of Harry Potter – Towards Data Science Throughout this NLP project, I've been using the collection of seven Harry Potter books as my text corpus. But before I could send these books through my algorithms, I first had to save the pdfs as txt files and then extract the chapters as separate documents. To do this, I used regular expressions. For a good intro to regex in Python, check out Google's quick course. Regular expressions can be thought of as wildcard search on crack. They allow an incredible amount of flexibility to describe patterns in text strings. If you've never seen them before, a regular expression pattern can look like a jumbled, senseless mess. But there is order in that chaos, and a great deal of power. Let's explore this with the pattern I used on these texts: 2018-12-09 23:55:42.977000+00:00 Read the full story.
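The author's actual pattern is behind the link. As a hedged stand-in, here is one plausible way to split a book text file into chapter documents with Python's re module; the heading pattern and file names are assumptions for illustration, not the article's code.

```python
# Illustrative chapter extraction with regular expressions.
import re

with open("hp1.txt", encoding="utf-8") as f:  # hypothetical txt file
    text = f.read()

# Assumed heading format: lines like "CHAPTER ONE" or "CHAPTER TWENTY-TWO".
chapter_re = re.compile(r"^CHAPTER [A-Z\-]+\s*$", re.MULTILINE)

# Slice the text between consecutive heading positions.
starts = [m.start() for m in chapter_re.finditer(text)]
chapters = [text[s:e] for s, e in zip(starts, starts[1:] + [len(text)])]

for i, chap in enumerate(chapters, 1):
    with open(f"chapter_{i:02d}.txt", "w", encoding="utf-8") as out:
        out.write(chap)
```

The re.MULTILINE flag is what lets `^` and `$` anchor to line boundaries rather than the whole string, which is the key to matching headings in the middle of a long document.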
Why you need a supercomputer to build a house When the hell did building a house become so complicated? Don't let the folks on HGTV fool you. The process of building a home nowadays is incredibly painful. Just applying for the necessary permits can be a soul-crushing undertaking that'll have you running around the city, filling out useless forms, and waiting in motionless lines under fluorescent lights at City Hall, wondering whether you should have just moved back in with your parents. And to actually get approval for those permits, your future home will have to satisfy a combinatorial tangle of complex and conflicting federal, state and city building codes, separate sets of fire and energy requirements, and quasi-legal construction standards set by various independent agencies. It wasn't always this hard – remember when you'd hear people say "my grandparents built this house with their bare hands"? These proliferating rules have been among the main causes of the rapidly rising cost of housing in America and other developed nations. The good news is that a new generation of startups is identifying and simplifying these thickets of rules, and the future of housing may be determined as much by machine learning as by woodworking. 2018-12-08 00:00:00 Read the full story.

Cloudera previews Kubernetes-specific machine learning platform Cloudera Inc. today announced a preview of a new cloud-native machine learning platform that runs on Kubernetes, the popular orchestration platform for software containers. Containers are portable, self-contained software environments that include code and all dependencies, enabling applications to run reliably in multiple computing environments. The company said the new Cloudera Machine Learning platform will deliver fast provisioning and automatic scaling as well as containerized, distributed processing in heterogeneous computing environments. It's intended to combine secure data access with a unified experience across on-premises, public cloud and hybrid environments. Secure data access spans Hadoop's HDFS file system, cloud object stores and external databases. 2018-12-05 00:00:00 Read the full story.

Behind a Paywall/Registration Wall…

Waitrose first supermarket to use robots to farm its food British supermarkets are to start selling food farmed in the UK by robots for the first time in a project led by Waitrose, the Telegraph has learned. The supermarket will use autonomous farming robots to analyse, plant and protect crops from weeds at a farm near Stockbridge, Hampshire. In a three-year trial, the robots – known as Tom, Dick and Harry – will start cultivating fields used to grow wheat for bread and flour sold in Waitrose stores. The robots, developed by Shropshire-based start-up the Small Robot Company, use artificial intelligence to scan thousands of pictures of a specific field. The images allow them to spot weeds and plant seeds in the best locations. 2018-12-08 00:00:00 Read the full story.

Alium Capital backs market research start-up Sydney-based investment firm Alium Capital has bought into a $4.5 million round from US start-up Respondent, as its Aussie co-founder Jack Pratten looks to head back Down Under. The company provides an online marketplace for businesses wanting to find participants for qualitative market research, and also gives individuals the opportunity to get paid for taking part in the research studies. Co-founder Pratten studied at the University o… 2018-12-09 00:00:00 Read the full story.

Woolworths changes tack on data sharing rules with suppliers Woolworths now intends to provide data on competitors' sales and sales growth in quartiles, rather than dollar sales or exact sales growth, as well as publicly available information, such as product descriptions, pricing and distribution. Woolworths believes the proposed changes will give suppliers enough information to compare their performance against that of rival brands, but not enough to enable them to rush a copy-cat product to … 2018-12-10 00:00:00 Read the full story.

Wall Street says Amazon and VMware are teaming up to take down Microsoft in the cloud wars Amazon's annual cloud conference, AWS re:Invent, is a gargantuan spectacle of crowds, electronic dance music, and a seemingly never-ending stream of new announcements. Most of all, it's a finely choreographed production designed to hammer home the message of Amazon Web Services' dominance in the cloud-computing market. Financial-research firm Wedbush Securities estimates that about 30% of computing workloads are in the cloud today and that the … 2018-12-08 00:00:00 Read the full story.
This news clip post is produced algorithmically based upon CloudQuant’s list of sites and focus items we find interesting. If you would like to add your blog or website to our search crawler, please email customer_success@cloudquant.com. We welcome all contributors. This news clip and any CloudQuant comment is for information and illustrative purposes only. It is not, and should not be regarded as “investment advice” or as a “recommendation” regarding a course of action. This information is provided with the understanding that CloudQuant is not acting in a fiduciary or advisory capacity under any contract with you, or any applicable law or regulation. You are responsible to make your own independent decision with respect to any course of action based on the content of this post.