AI & Machine Learning News: 05 November 2018

Google Wants You to Use A.I. to Save the World

Can artificial intelligence (A.I.) save the world? That seems to be Google’s thinking. Its new program, AI for Social Good, bills itself as a way for A.I. to solve “some of the world’s biggest challenges,” from predicting natural disasters to saving vulnerable species from extinction. As part of that initiative, Google has launched the Google AI Impact Challenge, which will pay out grants from a $25 million pool to any researchers who come up with good proposals for saving the world through A.I. “Grantees will also join a specialized Launchpad Accelerator program, and we’ll tailor additional support to each project’s needs in collaboration with data science nonprofit DataKind,” added Google’s official blog posting on the matter. “In spring of 2019, an international panel of experts, who work in computer science and the social sector, will help us choose the top proposals.” 2018-11-04 00:00:00 Read the full story.

Five projects that are harnessing big data for good

We argue that the data science boom shouldn’t be limited to business insights and profit margins. When used ethically, big data can help solve some of society’s most difficult social and environmental problems. Industry 4.0 should be underwritten by values that ensure these technologies are trained towards the social good (known as Society 4.0). That means using data ethically, involving citizens in the process, and building social values into the design. Here are five data science projects that are putting these principles into practice:
  1. Finding Humanitarian Hot Spots
  2. Improving Fire Safety In Homes
  3. Mapping Police Violence In The US
  4. Optimising Waste Management
  5. Identifying Hotbeds Of Street Harassment
2018-11-02 10:26:41+11:00 Read the full story. CloudQuant Thoughts… I put these two together as I thought the latter might give you some inspiration for the former. Or perhaps something from another article further down about how students at the UW Paul G. Allen School of Computer Science have been using tech to help find out why only some people are afflicted with Alzheimer’s, or using AR to annotate conversations in the real world. As you have seen highlighted in previous weeks, AI can be used for so many positive things, so get your thinking caps on!

AI’s bias problem — and what we need to do to solve it

Today, artificial intelligence (AI) has spread into all corners of life. Take Amazon Alexa, for example. We wouldn’t even bat an eye at someone asking Alexa to play a song, hail an Uber, or pull together a grocery list. AI is becoming just as common in business too. Virtually everyone has watched a movie or TV show on Netflix. Those suggested titles? That’s Netflix using AI to review past watching behavior to recommend new content to keep you engaged and subscribed. AI presents tremendous value for Amazon, Netflix, and practically any modern company, helping to unlock new insights from data and guide major decisions. But even with everything possible with AI, there are a few things to watch out for; high on the list: unintended bias. We need to recognize where potential biases come from to prevent problems from popping up in our AI applications and make sure they deliver the intended results. 2018-10-17 Read the full story. CloudQuant Thoughts… A few articles combined into a thought I have had before about how a stock market would look if all the participants were bots; surely the price would eventually just “settle”. One article this week on using machine learning to design company logos made me think about it as well… wouldn’t all logos drift down to a single simple two-color logo? A YouTube video on game design at a Dutch firm talked about how their ML development system would close in on the ideal “point of a cone” following a “sinusoidal path” down into the cone. Is this all descending into a single point?

AI to ‘reshape rather than replace’ finance workforce

The job market within the financial services sector is set for a significant shift as the revolution in artificial intelligence (AI) reshapes processes and roles, a panel told its audience at this year’s Money20/20. For Gregory Simpson, senior vice president and chief technology officer at Synchrony, while the workforce will change there may not be a wide-scale loss of opportunities in the marketplace. “We’ve taken a very conscious approach about how to re-skill people, because there will be different types of jobs,” he said. “We talk about AI as augmented intelligence versus artificial intelligence – and how augmentation can change jobs, and how prepared people are to fill those roles. There will be some jobs that need to be replaced, and in some cases there will be a displacement of other types of automation. In some cases it’s creating new jobs with people needed to build that automation. Our audit team is talking about hiring data scientists rather than auditors, for instance,” he added. Echoing the idea that financial services will embrace automation in a manner that is supplemented by a human workforce, Sriniketh Chakravarthi, senior vice president, banking and financial services, Cognizant, used changes in Europe’s banking sector as an indication of how the market will pivot. “If you go to Europe there’s no such thing as cheque processing – they still have it in the US but it has been eliminated in other parts of the world. Have they lost jobs in Europe because cheque processing has gone away? Not quite, because they’ve added on new jobs… 2018-11-02 00:00:00 Read the full story. CloudQuant Thoughts… Interesting point that Europe has all but eliminated cheques yet the banks have not reduced staffing levels; instead they have pivoted their people into other roles.

5 Bite-Sized Data Science Summaries – Cassie Kozyrkov

In the spirit of teamwork, the Next Rewind video series asked a bunch of people to pick up to five favorite talks from Google Cloud Next SF 2018 and discuss them on camera in no more than five minutes. Here are my 5 favorite talks from Next 2018 and the reasons I picked them. I got first dibs to choose out of over 300 talks, so these topics aren’t offcuts! They’re really the ones I think data science enthusiasts will appreciate most.
  1. Real businesses are already using AI for fun and profit!
  2. What is machine learning and how do I eat it? (Without a PhD)
  3. You can do machine learning in SQL now(!!)
  4. Data scientists, you no longer need a black belt in infrastructure
  5. TensorFlow is on a trajectory of increasing cuddliness
2018-11-03 00:46:20.271000+00:00 Read the full story. CloudQuant Thoughts… It has been a few weeks since we had a post by Cassie at Google.  

Jim Cramer: Why We Need Definitively Mixed Data Like We Are Getting

Do we want good news or do we want bad news? At this point in the cycle, with the Fed wanting to tighten like mad, we definitively and sadly want bad news; we want weaker-than-expected data and we literally have to hope for a weak jobless number on Friday. How did we get to this point? Because the stock market changed its coloration, or perhaps I should say its species, on two days in October, the third and the fourth. On the third, as my friend Ed Yardeni recounts in his incredibly good Yardeni Report, Fed Chief Jerome Powell told Judy Woodruff on PBS that “the really accommodative low interest rates that we needed when the economy was quite weak, we don’t need those anymore.” At the same time Powell was quoted by CNBC as saying “interest rates are still accommodative, but we are gradually moving to a place where they will be neutral. We may go past neutral but we’re a long way from neutral at this point, probably.” The next day? That’s the day that Vice President Pence laid out our cold war battle plan versus China in a speech that I think could easily lead one to believe that we favor regime change in the People’s Republic. The speech talked about how China is waging a multi-front campaign against our nation. “Beijing is employing a whole-of-government approach, using political, economic and military tools as well as propaganda to advance its influence and benefit its interests in the United States.” This was the speech that laid out for all to see that there is a much bigger agenda than trade at stake. It’s a multi-faceted war. First there’s the Made in China 2025 plan that must be stopped. To quote Pence again: “The communist party has set its sights on controlling 90% of the world’s most advanced industries, including robotics, biotechnology and artificial intelligence.” 2018-10-30 15:02:35-04:00 Read the full story.

Jim Cramer: I Think That the Cloud Kings Remain Red Hot

Who is right about the data center? Who can tell us if there is a slowdown or not? When we look at Friday’s decline and fall of anything having to do with the cloud and data, we find ourselves wondering if this is the end of the greatest secular theme of the era, the expansion of the cloud as the heart of information technology spending. Now, it takes a lot to stick a fork in a tremendous secular theme. But you can see how it can happen in this, an environment where, until today, everything seems to be going wrong. I have been saying for weeks that there is a new narrative that the cloud is slowing down, but it’s been so anecdotal that I always feel like I am shadow-boxing. You can’t pin down who or what is behind the myth or actuality of the cloud hiccup, blip or deceleration other than to say “okay, nothing can last this long.” 2018-10-29 11:33:55-04:00 Read the full story. CloudQuant Thoughts… I do not normally promote Jim Cramer’s posts into this top section, but I thought his first post perfectly captured the volatility in the market at the moment; it gives a good impression of what it feels like to be trading right now. The second post is interesting as an outsider’s view of the cloud business based purely on the returns of component businesses like Western Digital (WDC), who supply hard drives to the data centers.

Best Machine Learning GitHub Repositories & Reddit Discussions

This month’s collection comes from a variety of use cases – computer vision (object detection and segmentation), a PyTorch implementation of Google AI’s record-breaking BERT framework for NLP, extracting the latest research papers with their summaries, among others. Scroll down to start learning! Why do we include Reddit discussions in this series? I have personally found Reddit an incredibly rewarding platform for a number of reasons – rich content, top machine learning/deep learning experts taking the time to propound their thoughts, a stunning variety of topics, open-source resources, etc. I could go on all day, but suffice to say I highly recommend going through these threads I have shortlisted – they are unique and valuable in their own way.
GitHub
  • Faster R-CNN and Mask R-CNN in PyTorch 1.0
  • Tencent ML Images (Largest Open-Source Multi-Label Image Database)
  • PyTorch Implementation of Google AI’s BERT (NLP)
  • Extracting Latest Arxiv Research Papers and their Abstracts
  • DeepMimic
  • Bonus: AdaNet by Google AI
Reddit Discussions
  • What Developments can we Expect in Machine Learning in the Next 5 Years?
  • Advice for a Non-ML Engineer who Manages Machine Learning Researchers
  • Topic Ideas for Machine Learning Projects
  • Why do Machine Learning Papers have Such Terrible Math?
  • The Disadvantages of the Hype Around Machine Learning
2018-11-01 10:44:15+05:30 Read the full story. CloudQuant Thoughts… Oooooh, meta… a summary of posts quoting a summary of posts! These are always very nice summaries of interesting ML and AI posts on GitHub and Reddit.

AI lie detectors to be tested by the EU at border points

Lie detectors equipped with artificial intelligence are set to be tested at border points in Europe as part of an EU-funded project to combat crime and terrorism. Travellers will be asked to upload pictures of their passport, visa and proof of funds, and will then use a webcam to answer questions such as “what is in your suitcase?” from a computer-animated border guard. The €4.5m project, called iBorderCtrl, will be tested at the borders of Hungary, Latvia and Greece for six months. The aim of the project is to speed up traffic at the EU’s external borders. The UK, Spain, Poland, Germany and Cyprus also plan to participate in the project following initial trials. The technology is advertised as having a “unique approach to deception detection”, analysing the micro-expressions of travellers to figure out if the interviewee is lying. Travellers deemed low risk during the pre-screening stage will go through a short re-evaluation of their information for entry, while higher-risk passengers will undergo a more detailed check. 2018-11-01 00:00:00 Read the full story. CloudQuant Thoughts… WOW! What do you think about this one… it came out of left field! AI border guards reading humans’ “micro-expressions”!

No doubt about it: The new iPad Pro is a computer

At Apple’s product launch in Brooklyn, Tim Cook introduced the new iPad Pro by calling the iPad the most popular computer in the world, based on the fact that its yearly unit sales are larger than the total sales of the rest of the industry’s Windows laptops. And the new iPad Pros indeed look more like powerful computers than ever, despite their newly svelte exteriors. 2018-10-30 12:09:57 Read the full story. CloudQuant Thoughts… We are on the verge of splitting into two groups: the creators, who access real computers, and the consumers, who access tablet/phone computers. The former can install software; the latter can only purchase software from the localized app store. I feel like we may be cutting off the next generation here. How are we ever going to have new ideas and new creatives if they are walled into using pre-approved “store” software? Where will the next programmers come from? Also, Apple no longer wishes to tell us how many iPhones they are selling. Why? Because we are at peak phone. They prefer to tell us how well their “services” are performing, again choosing to leave out the details, leaving us to work out if they are raking in money from the store, or maybe finally catching Google and Amazon in the cloud race? However! Income from Google goes into services, so does that mean two thirds of that $15b is from the “$9b default search fee on iOS”? Apple does not sell your data! But Google sees $9b of value in all those high-end iPhone users. And… just in case you missed it, helium will destroy your iPhone (or at least knock it out for a week!).
Below the Fold… Hot Sauce… Tomato Sauce… A1 Sauce? Here are a few more sauces/sources to get you going…

My secret sauce to be in top 2% of a kaggle competition

Competing in kaggle competitions is fun and addictive! And over the last couple of years, I developed some standard ways to explore features and build better machine learning models. These simple, but powerful techniques helped me get a top 2% rank in Instacart Market Basket Analysis competition and I use them outside of kaggle as well. So, let’s get right into it! One of the most important aspects of building any supervised learning model on numeric data is to understand the features well. Looking at partial dependence plots of a model helps you understand how the model’s output changes with any feature. But, the problem with these plots is that they are created using a trained model. If we could create these plots from train data directly, it could help us understand the underlying data better. In fact, it can help you with all the following things:
  • Feature understanding
  • Identifying noisy features (the most interesting part!)
  • Feature engineering
  • Feature importance
  • Feature debugging
  • Leakage detection and understanding
  • Model monitoring
In order to make it easily accessible, I decided to put these techniques into a python package featexp and in this article, we’ll see how it can be used for feature exploration. 2018-11-05 02:25:48.309000+00:00 Read the full story.  
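The core idea behind these plots, binning a feature on the training data and looking at the mean target per bin, can be sketched in a few lines of plain Python. This is an illustration of the concept only; the function name and the equal-population binning scheme below are ours, not featexp's API.

```python
def feature_trend(feature, target, n_bins=5):
    """Bin a numeric feature into equal-population bins and return
    (mean feature value, mean target value) per bin."""
    pairs = sorted(zip(feature, target))
    size = len(pairs) // n_bins
    trend = []
    for i in range(n_bins):
        # the last bin absorbs any remainder rows
        chunk = pairs[i * size:(i + 1) * size] if i < n_bins - 1 else pairs[i * size:]
        xs = [x for x, _ in chunk]
        ys = [y for _, y in chunk]
        trend.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return trend

# A predictive feature shows a smooth, monotonic trend; a noisy one jumps around.
feature = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
target = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(feature_trend(feature, target))
```

Comparing the same trend on train and test splits is how this style of plot surfaces noisy features and leakage: a feature whose trend differs wildly between the two is suspect.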

Google open-sources BERT, a state-of-the-art pretraining technique for natural language processing

Natural language processing (NLP) — the subcategory of artificial intelligence (AI) that spans language translation, sentiment analysis, semantic search, and dozens of other linguistic tasks — is easier said than done. Procuring diverse datasets large enough to train text-parsing AI systems is an ongoing challenge for researchers; modern deep learning models, which mimic the behavior of neurons in the human brain, improve when trained on millions, or even billions, of annotated examples. One popular solution is pretraining, which refines general-purpose language models trained on unlabeled text to perform specific tasks. Google this week open-sourced its cutting-edge take on the technique — Bidirectional Encoder Representations from Transformers, or BERT — which it claims enables developers to train a “state-of-the-art” NLP model in 30 minutes on a single Cloud TPU (tensor processing unit, Google’s cloud-hosted accelerator hardware) or a few hours on a single graphics processing unit. 2018-11-02 00:00:00 Read the full story.  
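The heart of BERT's pretraining is the masked-language-model objective: hide a fraction of the input tokens and train the model to recover them from bidirectional context. A toy sketch of the input preparation follows; the 15% rate comes from the paper, but the helper itself is our invention, not Google's released code (which also sometimes keeps or swaps the chosen tokens rather than always masking them).

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    """Replace a random ~15% of tokens with [MASK]; the model's training
    target is to predict the original token at each masked position."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            labels.append(tok)    # loss is computed against this token
        else:
            masked.append(tok)
            labels.append(None)   # unmasked positions contribute no loss
    return masked, labels

masked, labels = mask_tokens("the cat sat on the mat".split())
print(masked)
print(labels)
```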

Facebook Open Sources Horizon to Streamline the Implementation of Reinforcement Learning Solutions

Reinforcement learning is one of the most exciting areas of development in the current artificial intelligence(AI) landscape. From AlphaGo to OpenAI Five, reinforcement learning has been at the center of major AI breakthroughs in the last few years. And yet, the implementation of reinforcement learning remains difficult enough that only specialized teams with advanced AI research skills have been able to pursue those efforts. Not surprisingly, most of the interesting reinforcement learning applications we see today come from AI powerhouses like Google, Microsoft, Amazon, Apple or Facebook. In the case of Facebook, the social media giant has been using reinforcement learning across different scenarios such as intelligent notifications or the M assistant. Recently, the Facebook engineering team open sourced Horizon, a framework that brings together some of the best practices Facebook has learned in the implementation of reinforcement learning solutions so that they can be used by mainstream developers. 2018-11-05 13:14:00.841000+00:00 Read the full story.  
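For readers new to the area, the value-based learning that frameworks like Horizon scale up with deep networks can be illustrated with plain tabular Q-learning on a toy problem. The five-state chain environment and the hyperparameters below are invented for illustration and have nothing to do with Horizon's API.

```python
import random

N_STATES, ACTIONS = 5, (0, 1)          # action 0 = step left, 1 = step right
alpha, gamma, eps = 0.5, 0.9, 0.3      # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
rng = random.Random(0)

for _ in range(300):                   # episodes; reward 1 for reaching state 4
    s = 0
    while s != N_STATES - 1:
        if rng.random() < eps:
            a = rng.choice(ACTIONS)                      # explore
        else:
            a = max(ACTIONS, key=lambda a: Q[(s, a)])    # exploit
        s2 = min(s + 1, N_STATES - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # standard Q-learning update toward the bootstrapped target
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# The greedy policy should prefer stepping right in every non-terminal state.
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)
```

Replace the Q table with a neural network and the toy chain with notification or recommendation logs, and you are in the territory that Horizon is built to industrialize.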

Open Source Model Management Roundup: Polyaxon, Argo, and Seldon

One of the most common questions the Anaconda Enterprise team receives is something along the lines of: “But really, how difficult is it to build this using open source tools?” This is certainly a fair question, as open source does provide a lot of functionality while offering a lower entry price than an enterprise platform. Furthermore, a platform you buy will never be as customized as a platform you build for your specific use cases. There are pros and cons for each approach, and the direction you decide to take depends on what business parameters you are trying to optimize at your organization at a very specific moment in time. In its simplest form, model management can be seen as training one machine learning model, then repeating this tens, hundreds, or thousands of times with different data, parameters, features, and algorithms to finally deploy the “best” one. A more complete definition would be that model management involves developing tooling and pipelines for data scientists to develop, deploy, measure, improve, and iterate so they can continue making better models not only for one particular problem but for wider ranges of datasets and algorithms. At the same time, model management carries the requirements of more traditional applications such as API development and versioning, package management, containerization, reproducibility, scale, monitoring, logging, and more. You can get so much value from ML compared to more traditional applications, but the investment you need to make can be overwhelming. This blog post examines some of the exploration, experiments, and prototypes that we at Anaconda have performed around our model management features in recent months, and proposes an open source model management pipeline built on top of Kubernetes. 2018-10-30 19:42:41+00:00 Read the full story.  
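The "train many, deploy the best" loop in that simplest definition can be sketched in a few lines. The toy model family and registry structure below are purely illustrative and do not correspond to Anaconda's tooling or any framework's API.

```python
def train(threshold, data):
    """A stand-in for a real training job: the candidate model is just a
    threshold rule, and 'training' only measures its accuracy."""
    correct = sum((x > threshold) == label for x, label in data)
    return {"params": {"threshold": threshold}, "accuracy": correct / len(data)}

data = [(0.1, False), (0.4, False), (0.6, True), (0.9, True)]

# Run several candidates with different hyperparameters and record each one.
registry = [train(t, data) for t in (0.0, 0.5, 1.0)]

# "Deployment" promotes the best-scoring run out of the registry.
best = max(registry, key=lambda run: run["accuracy"])
print(best)
```

Everything the post adds on top of this, versioning, containerization, monitoring, logging, is what turns this ten-line loop into a real model-management pipeline.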

Dirty Sauce – How Researchers Are Using Restaurant Reviews And Data Analytics To Predict Health Risks

All of us rely on online restaurant reviews before we try a new restaurant. These reviews, written by ordinary people like us, do not just help diners recognise good and bad restaurants or dishes; they also add greater value. With help from machine learning, this data can be used to predict health risks, and reports have suggested that restaurants are the most common source of foodborne illness. In this article, we will discuss one such research project that uses data analytics tools to solve this problem of health risks. Why does this research stand out? “Where Not to Eat? Improving Public Policy by Predicting Hygiene Inspections Using Online Reviews”, by Jun Seok Kang and Polina Kuznetsova of Stony Brook University, and Michael Luca and Yejin Choi of Harvard Business School, tries to find an approach for governments to harness the information contained in social media in order to make public inspections and disclosure more efficient. There have been studies in the past which tried to touch this issue using data analysis, but they concentrated on specific problems like influenza or food poisoning, and that is why they had to pay attention to a very small set of words for the NLP algorithm to train on. 2018-11-05 12:01:35+00:00 Read the full story.
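As a toy illustration of mining review text for hygiene signals: the study itself trains a statistical model against actual inspection outcomes over a rich feature set, but the flavor of the text-feature extraction can be shown with a hand-picked word list, which is entirely our invention and far cruder than anything in the paper.

```python
# Hypothetical list of hygiene-risk cue words; the real study learns its
# signals from data rather than using a fixed list like this.
RISK_WORDS = {"dirty", "sick", "smell", "cockroach", "stale"}

def risk_score(review):
    """Count occurrences of risk words in a review (case-insensitive)."""
    words = review.lower().replace(".", " ").replace(",", " ").split()
    return sum(word in RISK_WORDS for word in words)

reviews = [
    "Great pasta, friendly staff.",
    "The kitchen looked dirty and my soup had a strange smell.",
]
print([risk_score(r) for r in reviews])
```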

Make people valuable again

There is a disconnect between the pace and progress of the technical achievements made by innovators and entrepreneurs and the ways in which those technologies have added to human happiness. We have increased our technological powers many times over, and still we are not happier; we do not have more time for the things we find meaningful. We could use our powers to make each other, and thereby ourselves, more valuable, but instead we fear losing our jobs to machines and being considered worthless by the economy. 2018-11-04 00:00:00 Read the full story.

Experts: AI can generate ‘billions’ for you, but requires the long-term view

Some smart companies are already making billions of dollars from their AI investments, but they have taken a long-term view to achieve those gains. And the more forward-thinking a company is when working on AI, the more likely the technology is to exceed their expectations. Another key finding is that personnel decisions and teamwork are critical. That might seem obvious, but executives said their specific learning over the past year has been that data scientists need to work very closely with domain experts because it’s the latter who really know the business context, and data scientists by themselves can often get it wrong. This is in contrast to common thinking as recently as a year ago, when many people believed AI could often make better decisions than people. 2018-11-02 00:00:00 Read the full story.  

Ford and Baidu team up to test self-driving vehicles in China

Ford and Chinese internet giant Baidu announced a tie-up Wednesday that will see the two firms jointly test self-driving vehicles in China for two years. The initiative will see the two companies collaborate on the development and testing of driverless vehicles that meet the Level 4 standard set by U.S. industry organization SAE International. This means that autonomous vehicles developed by the two will not require intervention from a human driver. Ford and Baidu did not disclose any financial terms or ownership structure details of the venture. “Working with a leading tech partner like Baidu allows us to leverage new opportunities in China to offer innovative solutions that improve safety, convenience and the overall mobility experience,” Sherif Marakby, president and CEO of Ford’s autonomous vehicles unit, said in a statement Wednesday. 2018-10-31 00:00:00 Read the full story.  

MachineHack Launches New Hackathon, Take The Breed Classification Challenge

Our new hackathon is for the pet lovers and image processing enthusiasts out there. Did you know that a survey said that 94% of pet owners say their animal pal makes them smile more than once a day? There are no prizes for guessing which are the most popular pets on the planet: they are cats and dogs. But how do you choose which one to adopt? How do you know if you’re a cat person or a dog person? And finally, how do you choose which breed to adopt (because let’s face it, they all look cute)? To make matters more difficult, there are around 340 breeds recognized by the Fédération Cynologique Internationale (FCI), the governing body of dog breeds, which is also called World Canine Organisation. On the other hand, The International Cat Association (TICA) recognizes 58 standardized breeds of cats. Can you make your computer identify cat and dog breeds with some machine learning magic? 2018-11-01 04:31:20+00:00 Read the full story.  

Explainable Artificial Intelligence- XAI (Part 2) — Model Interpretation Strategies

In this article, we will be picking up from where we left off and expand further into the criteria of machine learning model interpretation methods and explore techniques for interpretation based on scope. The aim of this article is to give you a good understanding of existing, traditional model interpretation methods, their limitations and challenges. We will also cover the classic model accuracy vs. model interpretability trade-off and finally take a look at the major strategies for model interpretation. Briefly, we will be covering the following aspects in this article:
  • Traditional Techniques for Model Interpretation
  • Challenges and Limitations of Traditional Techniques
  • The Accuracy vs. Interpretability trade-off
  • Model Interpretation Techniques
2018-10-31 20:28:24.499000+00:00 Read the full story.  
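One of the model-agnostic techniques in the family this series covers, permutation importance, is simple enough to sketch directly: shuffle one feature column and measure how much the model's accuracy drops. The model and data below are toys of our own making, chosen only to show the mechanics.

```python
import random

def model(row):
    """A toy 'trained model' that in truth only looks at feature 0."""
    return row[0] > 0.5

X = [[0.1, 0.9], [0.2, 0.1], [0.8, 0.7], [0.9, 0.2]]
y = [False, False, True, True]

def accuracy(rows, labels):
    return sum(model(r) == t for r, t in zip(rows, labels)) / len(labels)

def permutation_importance(rows, labels, feature, n_repeats=20, seed=0):
    """Average drop in accuracy after shuffling one feature column."""
    rng = random.Random(seed)
    base = accuracy(rows, labels)
    total_drop = 0.0
    for _ in range(n_repeats):
        col = [r[feature] for r in rows]
        rng.shuffle(col)
        shuffled = [r[:feature] + [v] + r[feature + 1:] for r, v in zip(rows, col)]
        total_drop += base - accuracy(shuffled, labels)
    return total_drop / n_repeats

imp0 = permutation_importance(X, y, 0)
imp1 = permutation_importance(X, y, 1)
print(imp0, imp1)   # feature 0 matters; feature 1 is ignored by the model
```

Because it treats the model as a black box, the same loop works for any classifier, which is exactly the appeal of model-agnostic interpretation methods.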

8 must-read books on Statistics & Mathematics for Data Science – Alyssa Johnson

There are a number of points one must keep in mind while trying to master data science, especially for those who surround themselves with numbers and mountains of information. What’s more, the top organizations around the world are constantly in need of data science experts and specialists, so it helps to continuously refresh your knowledge of the basics of data science. Top 8 Best Books on Statistics and Mathematics
  • Pattern Classification — Richard O Duda
  • Practical Statistics for Data Scientists: 50 Essential Concepts — Peter Bruce and Andrew Bruce
  • Naked Statistics: Stripping the Dread from the Data — Charles Wheelan
  • R for Data Science: Import, Tidy, Transform, Visualize, and Model Data — Hadley Wickham and Garrett Grolemund
  • Introduction to Linear Algebra — Gilbert Strang
  • Introduction to the Math of Neural Networks — Jeff Heaton
  • Advanced Engineering Mathematics — Erwin Kreyszig
  • Elements of Statistical Learning — Trevor Hastie and Rob Tibshirani
2018-10-31 12:07:29+00:00 Read the full story.  

The Finanser’s Week: 29th October – 4th November

The main blog headlines are …
  • Clash of clans … or new bank versus old bank (Fidor, BPCE)
  • AI is only as good as the people who program it
  • Gartner: you’re better than this
  • Apps, DApps and Super-Apps
  • Keep on grafting
This week’s top news headlines are …
  • Deutsche Bank in need of ‘radical surgery’ as profits plunge – The Telegraph
  • HSBC profits soar as bank keeps a ‘strong grip’ on costs and flourishes in Asia – City AM
  • Girl offers piggybank to masked gunmen to try and make them leave family alone, police reveal – The Telegraph
  • Bitcoin may cause catastrophic climate change by 2033, study warns – The Telegraph
  • Venezuela Is Said to Move Cash Through an Obscure Russian Bank – Bloomberg
  • Reserve Bank of Australia Official ‘Not Convinced’ of Need for Digital Dollar – Coin Telegraph
  • Danske Bank drops to last place in private banking survey: Prospera – Reuters
  • Wells Fargo says aware about issues with ATM and credit card transactions – Reuters
  • Deutsche Bank dismisses boss of fund manager DWS – Financial Times
  • World’s billionaires became 20% richer in 2017, report reveals – Guardian
2018-11-04 09:06:37+00:00 Read the full story.

Overcome Your Biases with Data – Towards Data Science

Shortly after moving to Boston, I thought I noticed a striking phenomenon: loads of people smoking. After a few days, it seemed to me that every street corner was filled with people lighting up cigarettes. Having come from a small midwestern town where it was exceedingly rare to see anyone smoking, I was dismayed: maybe the big city encouraged vices I would eventually pick up, or worse, smoking rates were on the rise nationwide. While a few decades ago I would have had no option but to either persist in this belief or painstakingly look for demographic data in a library, now I was able to find verified data from the Centers for Disease Control and Prevention within seconds. To my surprise, and dealing a large blow to my rational view of myself, I found the following table comparing smoking rates in the metro area nearest my small town (Peoria, IL) to those in Boston… Not only was I wrong, I was significantly wrong, as indicated by the non-overlapping 95% confidence intervals. 2018-11-02 01:43:46.908000+00:00 Read the full story.
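The "non-overlapping 95% confidence intervals" check the author describes can be reproduced with the textbook normal approximation for a proportion, p ± 1.96·sqrt(p(1−p)/n). The rates and sample sizes below are made-up stand-ins, not the CDC figures from the article.

```python
import math

def ci95(p, n):
    """Normal-approximation 95% confidence interval for a proportion."""
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return (p - half, p + half)

city_a = ci95(0.22, 1000)   # hypothetical higher-smoking metro area
city_b = ci95(0.12, 1000)   # hypothetical lower-smoking metro area

# With A's rate above B's, the intervals overlap iff A's lower bound
# falls at or below B's upper bound.
overlap = city_a[0] <= city_b[1]
print(city_a, city_b, "overlap" if overlap else "no overlap")
```

When the intervals do not overlap, as here, the difference in rates is unlikely to be a sampling artifact, which is the argument the author is making against his own first impression.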

AI system predicts business cyberattacks hatched in dark web forums

An international cybersecurity research team has developed an AI system that predicts business targeted cyberattacks from forum discussions on the dark web. The dark web refers to the underbelly of the internet, cut loose from search engine indexes and accessible only via specialist browsers like Tor. Darkweb or deep-web marketplaces and forums are well known for being the go-to place for purchasing illegal drugs, guns, and forged documents online. The same forums and marketplaces are often teeming with hackers anonymously discussing vulnerabilities and selling malicious software that exploits them. Popular Russian hacking site FreeHacks has 5,000 active members who enthusiastically discuss techniques like “carding” (the term for credit card theft attacks) and “phreaking” (breaching someone’s security network). Until now, their discussions have not been analysed at scale. 2018-11-01 00:00:00 Read the full story.  

Report: Microsoft picks new chip maker for AI co-processors on Azure, cutting Intel’s business in half

Xilinx is reportedly the beneficiary of a decision by Microsoft Azure’s infrastructure wizards to add another chip supplier to its ranks, as it looks to serve more customers interested in machine learning. Bloomberg reported Tuesday that Microsoft and Xilinx have reached an agreement where Xilinx chips will account for half of the co-processors currently used on Azure servers to handle machine-learning workloads, one of the strongest areas of growth in the cloud. To date, that was the exclusive domain of Intel’s Altera division, which the chip maker acquired in 2015 to offer both its general-purpose processors as well as reconfigurable chips like Xilinx’s co-processors to cloud computing vendors. Flexible chips that can be configured to run machine-learning services have become hot commodities over the past several years. Xilinx was actually the first to bring the concept of FPGAs (field-programmable gate arrays) to the market in the 1980s, and it has been butting heads with Altera ever since, but for the most part the market for such chips was relatively limited. 2018-10-30 21:02:43-07:00 Read the full story.  

Five Tips to Completing Analytics’ Infamous “Last Mile” in 2019

In 2017, the term “Data Scientist” was LinkedIn’s fastest growing job title; yet, in the same year, McKinsey reported that less than 10 percent of Analytic Models that are developed actually make it to production where they can deliver ROI. The bottleneck lies in what industry insiders call “the last mile”, which is when models are moved into production after they’re built and trained. As we look ahead to the New Year, I predict 2019 will be the year the Data Science industry makes huge moves toward removing last mile challenges by making progress on five key fronts:
  • IT will prioritize model productionization.
  • “Model Deployment” and “Model Management” will become the buzz but it will also lead to confusion.
  • Data Scientists will get help in the form of ModelOps support.
  • Enterprises will begin to centralize model deployment and management.
  • Consensus will emerge on best practices for model deployment and management.
2018-11-05 00:30:49-08:00 Read the full story.  

Savvy Entrepreneurs Are Using Machine Learning to Streamline Logo Design

Machine learning has impacted the marketing profession in countless ways. A few months ago, I discussed the ways that Moz uses machine learning to develop its domain authority metrics, which is one of the best approximations of a website’s ability to rank in Google. However, there are other applications of machine learning in marketing that are much more prevalent, such as logo development. A growing number of entrepreneurs are realizing that machine learning is incredibly valuable for developing logos. One entrepreneur wrote a post on Indie Hackers about their own logo generator. The author pointed out that machine learning enabled them to create high-quality logos much more quickly than users could make them with Photoshop or other graphic design tools. Due to the innovativeness of their logo generator, they were able to earn $70,000 a month from it. Other companies like Oberlo have developed an even more impressive logo creator online. 2018-11-01 12:03:25+00:00 Read the full story.  

Driving Digital Transformation Using Secondary Storage

For decades, IT strategy focused on deploying, managing and optimizing the infrastructure used to run the ERP, CRM, office productivity, and other applications used by businesses, non-profits and government agencies to operate their enterprises. IT infrastructure vendors were the stars of the show, constantly seeking to one-up each other on their hardware’s “speeds and feeds” in order to secure coveted spots in enterprise data centers. However, the rise of hyperconverged infrastructure, Software as a Service (SaaS), and the cloud has completely disrupted the IT market, and with it, enterprises’ infrastructure-centric mindset. With a variety of scalable, inexpensive hyperconverged infrastructure using “commodity” hardware now an option for enterprise datacenters, along with SaaS and cloud services where infrastructure hardware is practically invisible to the customer, enterprises no longer see infrastructure as key to the success of their IT strategies. Rather, enterprises are moving to focus their strategies on digital transformation—the use of mobile, AI, data analytics and other digital technologies, in combination with their data, to optimize operations and create new services that transform how they deliver value. As a result, enterprises are realizing that IT needs to be centered more on the management and activation of the “information” aspect of information technology—their data—if they are going to succeed in their digital transformation and other strategic IT efforts. In this new world, data is the coin of the realm, and the winners will be the enterprises that can quickly and effectively harness data—the right data—to reduce risk, improve business outcomes and create new value for their customers, employees and other stakeholders. 
As they work to realize this goal, enterprises are increasingly finding that activating copy, backup, archived and other data located on secondary storage can be just as, if not more, useful for driving digital transformation than the production and original data located on their primary storage. 2018-11-01 00:00:00 Read the full story.  

GTS to Acquire Cantor’s ETF, Market Making Businesses

GTS, a leading electronic market maker across global financial instruments, today announced that it has entered into an agreement to acquire the Exchange Traded Funds (ETF) and wholesale market making businesses of Cantor Fitzgerald (“Cantor”), a global financial services firm. The transaction elevates GTS into the top-tier of global brokers with the potential to touch virtually every investing household in America and across the world. As part of the transaction, approximately 35 Cantor Fitzgerald personnel will become full-time GTS employees. The ETF business will be led by industry pioneer Reginald Browne along with his partners Eric Lichtenstein and Darren Taube. Overseeing the Wholesale Market Making business will be Joe Pleffner, Mike Strashnov and Menash Cohen. While at Cantor, the partners perfected expertise to optimally aggregate and execute institutional and retail trades across their extensive network of global clients. The unique combination of that business with GTS, which on many days trades upwards of six percent of all U.S. trading volume and is the largest Designated Market Maker (DMM) at the New York Stock Exchange (NYSE), ushers in a new era of technology access and innovation that will lower the costs of investing. “For the first time on a scale never seen before, the most sophisticated Wall Street technology is being deployed for mainstream investors, be they institutional or retail,” said Ari Rubenstein, CEO and co-founder of GTS. “Investors around the world can now leverage the very best in machine learning, artificial intelligence and execution technology to help them save money whenever they trade and invest. This is an unprecedented opportunity for investors that unites unrivaled innovation with pioneering client service – while enhancing the capital raising opportunities for listed companies.” 2018-11-02 13:52:24+00:00 Read the full story.  

Why artificial intelligence is ‘like electricity’ for Microsoft

Rico Malvar, Microsoft’s chief scientist, leaves no room for doubt about the importance of artificial intelligence to the tech giant. “It’s almost like electricity,” he says during a session at the Microsoft Research facility at the company’s sprawling campus in Redmond, outside Seattle. But where many think of AI as creating machines that can think and even feel for themselves, the Microsoft view is that AI needs to be included – or in Microsoft speak, “infused” – in everything, from a simple word processor to a quantum computer. It is, in a way, AI writ small. While Microsoft is working on its share of AI moonshots – perhaps most notably an AI-powered project that is trying to develop a universal blood test that would screen for a huge range of diseases – the company’s focus on enterprise and personal productivity, set by chief executive Satya Nadella, means it is pushing AI right down to its standard consumer products too. 2018-11-02 00:00:00 Read the full story.  

Which JS Framework Is Best For Big Data Development?

Want to know whether Angular or React is better? Understand the difference between the two right here! Are you working on big data development? There are a number of frameworks that you can use for JavaScript. Unfortunately, choosing the right framework isn’t always easy. Angular and React are two of the most popular. Which JS framework is best for your project? Should You Use Angular or React for Big Data Development? JavaScript is a front-end language, so big data developers don’t usually give it much consideration. However, it plays an important role in many aspects of the process. If you are a back-end developer, you still need to make sure that your approach is in sync with your front-end counterparts. This includes making sure that you know what JS framework they are using. If you are developing data applications on the front end, this is obviously even more important. The good news is that you can figure out which approach to take by following these guidelines. 2018-11-02 21:09:57+00:00 Read the full story.  

H2O.ai’s Full Suite of AI Platforms Now Available in the Microsoft Azure Marketplace

According to a new press release, “H2O.ai, the open source leader in artificial intelligence, today announced the availability of its full suite of products – including the open source platforms H2O and Sparkling Water and the award-winning automatic machine learning platform, H2O Driverless AI – in Microsoft Azure Marketplace, the online store providing applications and services for use on Microsoft Azure. H2O.ai customers can now take advantage of the scalability, high availability, and security of Azure, with streamlined deployment and management.” 2018-11-01 00:05:49-08:00 Read the full story.  

Synechron Launches 11 AI Data Science Accelerators For BFSI Sector (Banking, Financial Services and Insurance)

Synechron Inc., the noted global financial services consulting and technology services provider, this week announced the launch of its Artificial Intelligence Data Science Accelerators for Banking, Financial Services and Insurance (BFSI) firms. These new solution accelerators will help financial services and insurance firms solve complex business challenges by discovering meaningful relationships between events that impact one another (correlation) and cause a future event to happen (causation). Following the success of Synechron’s AI Automation Programme, Neo, Synechron’s AI Data Science experts have developed a powerful set of accelerators that allow financial firms to address business challenges related to generating investment research, predicting the next best action to take with a wealth management client, identifying high-priority customer complaints, and better predicting credit risk in mortgage lending. 2018-11-02 10:09:49+00:00 Read the full story.  
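
The correlation half of that correlation-versus-causation distinction is straightforward to compute. The sketch below uses a plain Pearson coefficient on invented monthly series; the variable names and numbers are illustrative only, not Synechron data or APIs:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly mortgage approvals vs. later defaults.
approvals = [120, 135, 150, 160, 180, 200]
defaults = [8, 9, 11, 12, 14, 16]
print(round(pearson(approvals, defaults), 3))
```

A coefficient near 1 only flags co-movement; establishing that one event causes another requires the extra machinery (temporal ordering, interventions) the article alludes to.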

OutSystems Expands Artificial Intelligence and Machine Learning Initiative with Launch of

Low-code development leader announces new AI-powered features, executive hires, and partnership with Carnegie Mellon University. OutSystems, provider of the number one platform for low-code application development, today announced a new program that advances the company’s mission to bring the power of artificial intelligence and machine learning (AI/ML) to software development. The project introduces a new domain, names a new head of the OutSystems AI/ML group, and launches new and ongoing research into AI-assisted software development. The project builds on the company’s previously announced “Project Turing” AI initiative. Project Turing, named for the father of theoretical computer science and artificial intelligence, Alan Turing, established a new AI Centre of Excellence in Lisbon, committed 20 percent of the company’s overall R&D budget to AI/ML, and developed partnerships with industry experts, tech leaders, and universities to drive original research and innovation. 2018-11-01 00:00:00 Read the full story.  

Web Summit plans VC fund to leverage startup data from its mega conference

From Web Summit’s very inception, conference organizers have utilized a sophisticated data operation to build it into one of the world’s largest technology events. Now the group behind the conference plans to use that same foundation of information to launch a seed-stage venture fund. According to a report today in the Financial Times, the Web Summit organization has filed with the U.S. Securities and Exchange Commission to create a $50 million venture capital fund called Amaranthine. “We help some companies in a really huge way for three days,” Web Summit CEO Paddy Cosgrave told the paper. “What about the other 362 days?” According to the report, Web Summit will leverage the massive amount of information it gathers on the thousands of startups and attendees who come through the event, which kicks off today in Lisbon, Portugal. Organizers recently signed a deal to keep the conference in Lisbon for another 10 years. Started in Dublin, the event relocated to Lisbon in 2016 after it outgrew its hometown. From a modest start of 400 attendees in 2011, it quickly grew to 22,000 by 2014 and to the more than 60,000 expected this week in Lisbon. Web Summit’s founders invested early in building a data science team, developing matchmaking algorithms and video capture systems to track movement throughout the event. That has created a treasure trove of information that the new venture fund can use to identify hot startups, according to the Financial Times report. 2018-11-05 00:00:00 Read the full story.  

Data Warehouse Modernization and the Journey to the Cloud

To say that organizations today are facing a complex data landscape is really an understatement. Data exists in on-premises systems and in the cloud; data is used across applications and accessed across departments. Information is being exchanged in ever-growing volumes with customers and business partners. Websites and social media platforms are constantly adding data to the mix. And now there’s even more data coming from new sources such as the Internet of Things (IoT) via sensors and smart, connected devices. This proliferation of data sources is leading to a chaotic, “accidental architecture”, where organizations can’t get the right data to the right people at the right time. That means users such as business analysts and data scientists can’t adequately analyze relevant data and get the most value out of it to enhance the business. 2018-10-30 00:00:00 Read the full story.  

If a business is not data driven, it will not exist within five years: IAPA MD

If your business is not data driven within five years, you will not have a business, according to Annette Slunjski, Managing Director of the Institute of Analytics Professionals of Australia. She said: “We’re not going to be in the fourth industrial revolution unless we can start harnessing the data we have and start making better decisions. One of the biggest issues for the country is that there is a cohort of workers who aren’t data literate who are about to have their jobs disrupted by automation. They won’t be able to find new ones unless they have data skills, because those are going to be mandatory. Data understanding and data literacy education is an organisational imperative to be part of the emerging data economy.” 2018-11-05 16:00:30+11:00 Read the full story.  

Edge vs cloud computing: which is the best investment?

Proactive investing requires making choices that often demand conviction about a decision many years ahead of its time. Australian ultra high net worth (UHNW) investors – many of whom invest in fintech in California, Boston, Berlin or Shanghai and in heavy IT infrastructure via private equity, seed or even syndicated capital – face a bifurcating choice: invest further in existing cloud-based infrastructure, or in its nascent alternative, known as “edge” computing. These decisions carry serious consequences for invested capital. With the Internet of Things (IoT) expanding from a base of 980 million devices in 2018 to an estimated 5.6 billion devices by 2020, edge computing, in essence, allows data produced by IoT devices to be processed closer to where it is created, instead of being sent across long routes to data centres or clouds in the United States, northern Europe or even Siberia. 2018-10-31 00:00:00 Read the full story.  

AI And Automation Aren’t Quick Wins — Invest Anyway

Automation, AI, and robotics have risen to the top of the CIO agenda, but the road to realizing business value with these technologies is long and winding. Organizations that succeed with these technologies make numerous investments in prerequisites, which Forrester encapsulates in a model called RQ, the robotics quotient. We’ve just released a new report to help CIOs navigate through the challenges. In “Automation, AI, And Robotics Aren’t Quick Wins,” we find several common situations that enterprises find themselves in today:
  •  Over-investing in moonshots, leading to failure.
  • Focusing on narrow, well-specified problems, heightening chances of success.
  • Addressing real-time, insights-driven, customer-obsessed actions, with a chance for success.
2018-11-01 09:49:10-05:00 Read the full story.  

How retail digitalization is driving data centre expansion

A new report from Vertiv has revealed how digital transformation has impacted the data centre requirements of major international retailers. The rapid evolution of digitalization has left consumers expecting an enhanced and flawless service when interacting across retail channels, prompting retailers to rush to leverage IoT, cloud, and big data to get ahead of competitors and deliver unique customer experiences. To investigate the impact of these initiatives on the infrastructure supporting retail, Vertiv partnered with Data Center Dynamics to interview executives from 50 major retailers with a combined annual revenue of $953 billion. Survey respondents were responsible for 420 in-house data centres and 522 smaller distributed data centres, spanning 3.5 million square feet and with 473MW of combined power capacity. 2018-11-01 00:00:00 Read the full story.  

Data Warehouses and GPUs: Big Data at High Speed

“Three years ago it was tough to tell the market that they should put a Data Warehouse on top of something that runs on top of GPUs,” said Ami Gal, CEO and co-founder of SQream. “Now it’s clear that GPUs are storming A.I., Machine Learning, and data centers,” and GPU technology has become an accepted way to run queries on massive datasets. GPU is an acronym for Graphics Processing Unit, first designed by Nvidia to speed up the production of graphics and video for gaming in 1999. Shortly thereafter, Gal wondered if it was possible to put that superior speed to use running a database. “We thought it would be cool if we could effectively run a SQL query on a GPU,” something that was considered impossible at the time. “And when you say to an Israeli engineer, ‘Oh, it’s impossible,’ you only give him motivation to work on it.” 2018-10-31 00:35:40-08:00 Read the full story.  

Big Data and Blockchain are Riding the Hype Wave, Why?

Managing massive amounts of numbers, and related data, has become a core part of every sector – from business administration to local and central government agencies. With the help of Big Data and Data Science, harvesting and managing data has become easier, helping organizations run much more successfully. However, there was a time when Data Scientists were unable to share, secure and authenticate data integrity. With the help of Blockchain technology, the tables now seem to have turned. This open-source, transparent network, secured by robust cryptographic calculations, has caught the attentive eyes of data specialists. Often referred to as Big Data Analytics, Data Analytics provides clear insights that the human eye can easily miss. With the help of linear regression, logistic regression, pattern recognition and other sophisticated mathematical techniques, the technology has proved revolutionary. Speech recognition, self-driving cars and spam identification are some real-world use cases to take into account. Apart from this, the main objective of using Big Data is to create autonomous machines that can function without human intervention. Adding Blockchain is like adding another layer to the Big Data Analytics procedure. The layer mainly includes:
  • Security: blockchain-generated Big Data cannot be forged, thanks to the network architecture.
  • Value: blockchain-generated Big Data is more structured, abundant and complete, making it a strong source for further analysis.
2018-10-31 00:30:22-08:00 Read the full story.  
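
The tamper-evidence claim in the first bullet comes down to hash chaining: each record carries a hash computed over its own contents plus the previous record’s hash, so altering any record invalidates every later link. A minimal sketch, illustrative only; real blockchains add consensus, signatures and distribution on top:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first block

def build_chain(records):
    """Link records so each block's hash covers its payload and the previous hash."""
    blocks, prev = [], GENESIS
    for payload in records:
        digest = hashlib.sha256((prev + json.dumps(payload, sort_keys=True)).encode()).hexdigest()
        blocks.append({"payload": payload, "prev": prev, "hash": digest})
        prev = digest
    return blocks

def verify_chain(blocks):
    """Recompute every hash; an edited payload breaks the chain from that point on."""
    prev = GENESIS
    for block in blocks:
        expected = hashlib.sha256((prev + json.dumps(block["payload"], sort_keys=True)).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

ledger = build_chain([{"sensor": "a", "temp": 21}, {"sensor": "a", "temp": 22}])
print(verify_chain(ledger))   # True: untampered
ledger[0]["payload"]["temp"] = 99
print(verify_chain(ledger))   # False: the edit invalidates every later hash
```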

Microsoft Research and Cambridge University strengthen their commitment to AI innovation and the field’s future leaders

Microsoft is partnering with the University of Cambridge to boost the number of AI researchers in the UK and help them change the world for the better. The Microsoft Research-Cambridge University Machine Learning Initiative will provide support for Ph.D. students at the world-leading university, and offer a postdoctoral research position at Microsoft Research Lab, Cambridge. Our aim is to realise artificial intelligence’s potential in enhancing the human experience and to nurture the next generation of researchers and talent in the field. Microsoft is working hard on the biggest challenges in the field of AI so we can develop tools that will change lives and organizations across the world. Skype Translator has helped people communicate in many languages and countries. After 10 years of research we’ve released Infer.NET, a cross-platform framework for model-based machine learning that has been used in asthma research and gene analysis, among many other areas. The parts of society that will benefit most from machine learning will need sophisticated solutions that reflect the complexities of the world in which we live. Such intelligent infrastructure has the potential to support decision-making in numerous fields, including healthcare, education, transport, urban planning and agriculture. However, Microsoft cannot make the necessary advancements alone, nor should we do so at the expense of academia. 2018-10-31 00:00:00 Read the full story.  

UK companies at risk of falling behind due to a lack of AI strategy, Microsoft research reveals

Artificial intelligence is changing the UK so fast that nearly half of bosses believe their business model won’t exist by 2023, a new Microsoft report has revealed. The research – entitled Maximising the AI Opportunity – found that this country has a unique opportunity to lead the world in the development and use of AI but only if companies act quickly to embrace it. While 41% of business leaders believe they will have to dramatically change the way they work within the next five years, more than half (51%) do not have an AI strategy in place to address those challenges. Clare Barclay, Chief Operating Officer of Microsoft UK, said: “AI represents a huge opportunity, but only if UK organisations embrace its application in the right way. AI is not about making UK businesses leaner, it’s about how we use the technology to make them stronger. In doing so, we can make our work more meaningful and boost UK competitiveness.” 2018-10-31 00:00:00 Read the full story.  

Data Radically Shifts the Way Oil & Gas Industry Approaches Operational Efficiency

With falling prices in the energy sector, oil and gas companies need to focus more on operational efficiency, and Data Management and analytics are predicted to lead the industry’s digital journey. Capital investments in energy projects have doubled since the year 2000 and are likely to grow to $2 trillion annually by 2035, so accurately predicting costs against benefits is essential. Looking back, the oil and gas industry was an early adopter of digital transformation. Approximately 50 years ago, it was the first industry to use digital distributed systems (DDS) to take charge of refineries and other downstream plants. Then came the digital oilfield concept, which made the industry a leader in adopting digital representations of seismic data to model deposits and reserves. The challenge for the industry now is to implement the next wave of Digital Transformation and radically shift the way it approaches operational efficiency with the help of data and analytics. 2018-11-01 00:30:21-08:00 Read the full story.  

Scoop Up AI’s Benefits, Skip its Risks and Pitfalls

It seems everyone is awash in enthusiasm for Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL). Indeed, there’s a lot of business advantage, both tactical and strategic, to be had from the technology. But there’s potential business risk as well, and a lot of factors that can challenge successful adoption. Are these insurmountable? No. But accounting for these factors ahead of time can save a lot of grief. In this article, I’ll point out some possible AI pitfalls to help you prepare for them, and I’ll provide advice on how to avoid them altogether. The biggest challenge with AI is that it’s hard for a single organization to implement on its own, and individuals possessing the rarefied skill set required to do it are in short supply. Data Scientists, as these specialists are often known, need a cocktail of statistical, database, cognitive science and business domain expertise. These skills are each specialized enough on their own; requiring the combination of them constitutes a very tall order.
  • The Data Scientists
  • Tools Need Sharpening
  • Great Expectations
  • The Bright Side
2018-11-05 00:35:11-08:00 Read the full story.  

Data Science and the Trading Desk

Francis Bacon, René Descartes and Isaac Newton were among pioneers who advanced the idea of making conclusions based on observation and evidence, rather than just reasoning. Centuries later, institutional brokers are incorporating tenets of the scientific method into their own pursuits of buying and selling blocks of equity. The nutshell premise is that data and proof walk, conjecture talks. This is especially the case in a rapidly evolving market with a multitude of promising — but untested — trading options. “At UBS in the Americas our view is that the equity ecosystem continues to evolve and become increasingly complex in terms of new order types, new venues and new sources of liquidity,” said Todd Lopez, Head of Americas Cash Equities at investment bank UBS. “There continues to be more competition and diversity in liquidity sources. To effectively navigate this environment we need to understand in forensic detail when and how to access these sources and leverage new order types.” Sell-side trading desks utilizing data isn’t new. What is new is the level of sophistication of buy-side investment managers, who need to see evidence that a methodology works. Brokers need to show, not just tell. “Our clients are becoming increasingly sophisticated in how they measure results and are pushing us harder to optimize our capabilities to solve their specific use cases,” Lopez said. “They require empirical evidence that taking a particular approach will result in lower implementation costs of trading.” 2018-11-02 17:01:54+00:00 Read the full story.  

The Future of Active Fund Management? It’s Vinyl

Active fund managers face an existential crisis. Study after study suggests too few of them beat the benchmarks they track. The flood of money into low-cost index tracking products is evidence that investors are losing faith in their ability to outperform and are increasingly unwilling to pay for that lack of alpha generation. So how will the industry look in a decade’s time? As the Global Director of Content at the CFA Institute, the body that oversees the coveted Chartered Financial Analyst qualification, Jason Voss has spent the past few years pondering the industry’s fate. I caught up with Voss, who left the group last month to become an independent consultant, by telephone from Sarasota, Florida, earlier this week. Following is a lightly edited transcript of our conversation… 2018-11-02 12:45:30-04:00 Read the full story.  

AI-Enabled Predictive Maintenance Can Make Vehicle Recalls a Thing of the Past

Vehicle recalls continue to make headlines, costing car manufacturers millions of pounds and great reputational damage. This is all despite the fact that the automotive sector is one of the industries where manufacturing processes are most stringent in terms of quality checks, regular maintenance and monitoring. With technology turning vehicles into platforms of innovation, through leveraging security, efficiency and computing power performance, it’s time for manufacturers to become “smarter” when it comes to deploying the right technologies throughout the production and post-purchase lifecycle. This is where AI, machine learning and predictive analytics can enable manufacturers to gain full visibility and control of manufacturing processes. They say that the best time to correct an error is yesterday, and that is true in the automotive sector. While AI and machine learning cannot—yet—turn back time, cognitive technologies can analyze data in ways that were previously unattainable. 2018-11-01 00:00:00 Read the full story.  

Dynamics 365 AI for Customer Service is now available in public preview

Microsoft is excited to announce that Dynamics 365 AI for Customer Service Insights is now available in public preview, giving businesses access to artificial-intelligence-powered insights on their customer service data. Using our natural language understanding technology, we are able to automatically group cases by support topics, all without the need for any manual tagging of cases. Thanks to this clustering, new insights become available, such as automatic identification of quickly growing support topics before they reach overwhelming volume. With the power of AI, your support team can efficiently identify opportunities for providing better customer experiences. Built-in dashboards, interactive charts, and visual filters provide views into support operations data across channels, and highlight areas for improvement that can have the greatest impact, helping you quickly evaluate and respond to key performance indicators (KPIs) and customer satisfaction levels. 2018-10-31 00:00:00 Read the full story.  
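
The automatic grouping described here can be approximated, very crudely, by clustering cases on word overlap. The sketch below greedily groups tickets by Jaccard similarity; the real service uses Microsoft’s natural language understanding models, and the ticket texts and threshold here are invented:

```python
def jaccard(a, b):
    """Word-set overlap between two short texts, in [0, 1]."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def group_cases(cases, threshold=0.3):
    """Greedily assign each case to the first cluster whose seed case is similar enough."""
    clusters = []
    for case in cases:
        for cluster in clusters:
            if jaccard(case, cluster[0]) >= threshold:
                cluster.append(case)
                break
        else:
            clusters.append([case])
    return clusters

cases = [
    "password reset not working",
    "cannot reset my password",
    "invoice shows wrong amount",
    "wrong amount on my invoice",
]
groups = group_cases(cases)
print(len(groups))   # 2: a password-reset topic and an invoicing topic
```

A production system would use embeddings rather than raw word overlap, but the shape of the problem — unsupervised grouping in place of manual tagging — is the same.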

Movie Recommendations with Spark Collaborative Filtering

Collaborative filtering (CF)[1] based on the alternating least squares (ALS) technique[2] is another algorithm used to generate recommendations. It produces automatic predictions (filtering) about the interests of a user by collecting preferences from many other users (collaborating). The underlying assumption of the collaborative filtering approach is that if a person A has the same opinion as a person B on an issue, A is more likely to share B’s opinion on a different issue than a randomly chosen person would be. This algorithm gained a lot of traction in the data science community after it was used by the team that won the Netflix Prize. 2018-11-02 00:00:00 Read the full story.  
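
The ALS step itself is easy to sketch outside Spark: with the item factors held fixed, each user’s factor vector is the solution of a small ridge-regression problem, and the two sides are solved alternately. A minimal NumPy version on a dense toy ratings matrix (Spark’s ALS additionally handles sparse, partially observed ratings; everything below is illustrative):

```python
import numpy as np

def als(R, k=2, reg=0.1, iters=20, seed=0):
    """Factor a dense ratings matrix R ~ U @ V.T by alternating least squares."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = rng.standard_normal((n_users, k))
    V = rng.standard_normal((n_items, k))
    I = reg * np.eye(k)
    for _ in range(iters):
        # Fix V, solve the regularized least-squares problem for the user factors.
        U = np.linalg.solve(V.T @ V + I, V.T @ R.T).T
        # Fix U, solve for the item factors.
        V = np.linalg.solve(U.T @ U + I, U.T @ R).T
    return U, V

# Toy 4-user x 3-movie ratings matrix: two taste groups.
R = np.array([[5.0, 4.0, 1.0],
              [4.0, 5.0, 1.0],
              [1.0, 1.0, 5.0],
              [1.0, 2.0, 4.0]])
U, V = als(R)
pred = U @ V.T   # reconstructed ratings; unseen cells would be the recommendations
print(np.round(pred, 1))
```

On real data most of R is unobserved, so the per-user and per-item solves are restricted to the rated entries; that is the variant Spark parallelizes.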

Inference Engine Aimed at AI Edge Apps

Flex Logix, the embedded FPGA specialist, has shifted gears by applying its proprietary interconnect technology to launch an inference engine that boosts neural inferencing capacity at the network edge while reducing DRAM bandwidth requirements. Instead, the inferencing engine draws greater processing bandwidth from less expensive and lower-power SRAMs. That inference approach is also touted as a better way to load the neural weights used for deep learning. Unlike current CPU, GPU and Tensor-based processors that use programmable software interconnects, the Flex Logix approach leverages its embedded FPGA architecture to provide faster programmable hardware interconnects that require lower memory bandwidth. That, the chip maker said, reduces DRAM bandwidth requirements—and fewer DRAMS translates to lower cost and less power for edge applications. 2018-11-01 00:00:00 Read the full story.  

IoT Is The Most Important Development Of The 21st Century

The internet of things refers to a network of physical devices, automobiles, home appliances and other items that use actuators, electronics, sensors, software and connectivity to enhance connection, collection and data exchange. The IoT provides a platform that creates opportunities for people to connect these devices and control them with big data technology, which in turn will promote efficiency in performance, deliver economic benefits and minimize the need for human involvement. It’s the most important development of the 21st century. IoT involves the extension of internet connectivity beyond personal computers and mobile devices, reaching a wide range of devices that were never internet-enabled. Once those devices have been embedded with technology, they are brought to life and can communicate with each other through the internet. This means they can be monitored and controlled remotely. For instance, the rise of autonomous driverless cars has become more feasible because of IoT implementation. 2018-10-29 17:08:30+00:00 Read the full story.  

Picking Sides in the Customer 360 War

Don’t look now but a war is shaping up among software giants to win the hearts and minds of consumers with big data and next-generation customer 360 analytics. Three separate groups, including Oracle CX Unity, the Open Data Initiative, and Salesforce’s Customer 360, have been created in the past month to build data-sharing ecosystems that streamline the gathering and analysis of customer data for sales and marketing purposes. Customer 360 is not a new concept. Large corporations have been chasing – and finding some degrees of success with – the notion of centralizing all data about their customers as a way to drive greater understanding of their wants, needs, and concerns. These companies typically built customer 360 systems atop large MPP data warehouses that housed data originating from various applications, including ERP, CRM, and POS systems. 2018-10-31 00:00:00 Read the full story.  

Manchester University’s new £15m supercomputer could unlock secrets of human brain

A “human brain” supercomputer with 1 million processors has been switched on by British scientists for the first time. Built by Manchester University, the £15m “SpiNNaker” machine can complete more than 200 million actions per second and has 100 million moving parts. Its creators hope that it will be able to “unlock some of the secrets of how the human brain works”. Unlike traditional computers, it doesn’t communicate by sending large amounts of information from point A to B. Instead it mimics the communication architecture of the brain, sending billions of pulses – small amounts of information – simultaneously to thousands of different destinations. Scientists have simulated a region of the brain called the Basal Ganglia, an area affected in Parkinson’s disease, raising hopes that it may have potential for neurological breakthroughs in future pharmaceutical testing. 2018-11-02 00:00:00 Read the full story.  

Toby Cosgrove, former Cleveland Clinic CEO turned Google advisor

Toby Cosgrove has a new job. After decades running one of the world’s most famous hospitals, Cleveland Clinic, he’s helping one of Silicon Valley’s most prominent technology companies figure out how to sell its technology and services into health care. Cosgrove announced earlier this year that he’s taking a position at Google as an advisor to the Google Cloud health care and life sciences team. Surprisingly, he’s not fighting to win large cloud contracts at the largest hospitals. That’s because most hospitals have already invested hundreds of millions of dollars into on-premises IT systems from companies like Epic Systems and Cerner, including installation, upgrades and training. At Cleveland Clinic, for instance, Cosgrove was one of the first big customers to invest in Epic Systems’ electronic medical record software. Cosgrove doesn’t see these companies moving their entire infrastructures to a hosted cloud system like Google Cloud, or the more popular rival products from Amazon and Microsoft. Instead, he wants to figure out what kinds of apps can be built on top of these systems to help hospitals start to modernize. 2018-11-04 00:00:00 Read the full story.  

How AI makes in-app ad creatives better

Why do some mobile in-app ad campaigns succeed, while others fall flat? In part, it’s because the creatives used are just ineffective. Too often, ads go unseen — and unclicked. In the second quarter of 2018, Moat’s average valid and viewable rate was around 60 percent, while the average viewable rate noted by IAS in the same time frame was less than 50 percent. That means two of every five ads will never be seen in full by a real person. The average click-through rate of an in-app ad is just over 1.5 percent. While it’s better than the average CTR for mobile web ads (1.12 percent), it still means that a lot of ads are not leading to sales, app downloads, and sign-ups. So, is there a better way? What can advertisers do to make sure their ad spend yields real results? This is one application where artificial intelligence (AI) and machine learning (ML) can help in a major way. By applying advanced analytical insights to the art of ad creatives, mobile marketers can be sure their ad campaigns are more appealing and thus more effective. 2018-11-05 00:00:00 Read the full story.  
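The article’s arithmetic is worth making concrete. A minimal sketch using the rates it quotes (a roughly 60 percent viewable rate and a CTR just over 1.5 percent; the function, its name, and the impression count are illustrative, not from the article):

```python
def campaign_outcomes(impressions, viewable_rate=0.60, ctr=0.015):
    """Estimate how many served ads are actually seen, never seen, and clicked."""
    viewable = impressions * viewable_rate
    never_seen = impressions - viewable   # the "two of every five" the article cites
    clicks = impressions * ctr
    return viewable, never_seen, clicks

viewable, never_seen, clicks = campaign_outcomes(1_000_000)
print(f"Viewable:   {viewable:,.0f}")    # 600,000
print(f"Never seen: {never_seen:,.0f}")  # 400,000 — two in five
print(f"Clicks:     {clicks:,.0f}")      # 15,000
```

At these rates, a million served impressions yield only about 15,000 clicks, which is why creative quality and targeting matter so much.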

How a German Manufacturing Company Set Up Its Analytics Lab

Over the past few years, most businesses have come to recognize that the ability to collect and analyze the data they generate has become a key source of competitive advantage. ZF, a global automotive supplier based in Germany, was no exception. Digital startups had begun producing virtual products that ZF did not know how to compete against, and engineers in logistics, operations, and other functions were finding that their traditional approaches couldn’t handle the complex issues they faced. Some company executives had begun to fear they were in for their own “Kodak moment” – a fatal disruption that could redefine their business and eliminate, overnight, advantages accumulated over decades. With automotive analysts forecasting major changes ahead in mobility, they began to think that the firm needed a dedicated lab that focused entirely on data challenges. 2018-11-02 14:00:42+00:00 Read the full story.  

Intel launches Xeon E-2100 series, previews Cascade Lake

Intel revealed in August that its next-generation Xeon processors would launch in fall 2018, and today the company made good on that promise. The chipmaker debuted two new additions to its data-centric product lineup — the Xeon E-2100 series and Cascade Lake advanced performance — and provided an update on its broader momentum. “There’s an exponential growth cycle in data driven by the push to the edge, the massive personalization of services, and the insatiable demand for new capability … [but] to the best of our knowledge … only between 1 and 2 percent of that data is being used and analyzed,” Lisa Spelman, vice president and general manager of Intel Xeon products and datacenter marketing, said during a conference call with reporters. “The addition of Cascade Lake advanced performance CPUs and Xeon E-2100 processors to our Intel Xeon processor lineup … demonstrates our commitment to delivering performance-optimized solutions to a wide range of customers.” Intel’s pitching its forthcoming Cascade Lake advanced performance as a “new class” of Xeon Scalable Processors — one focused squarely on high-performance computing (HPC), artificial intelligence (AI), and infrastructure-as-a-service (IaaS) workloads. It’s a multichip Cascade Lake-based package comprising two sockets with a high-speed Ultra Path Interconnect, delivering a combined 48 cores per CPU and 12 DDR4 memory channels. 2018-11-04 00:00:00 Read the full story.  

5 Critical Steps to Predictive Business Analysis – Towards Data Science

As a Data Engineer, when I am solving a problem, I often ask myself: what if there were no data for this problem? What if I had to make a design change with no clue how the market or users would react to it? Is there a more reliable method for decision-making? This post will walk you through five critical steps to design a robust and conclusive experiment for A/B testing. These are the major points we’ll be covering:
  • Introduction to A/B testing.
  • Choosing the right business metrics.
  • Statistical review.
  • Designing the experiment.
  • Analyzing the results.
There are many A/B testing tools on the market. In this blog, I won’t be focusing on any particular tool for A/B testing, but rather on how it works. I’ll discuss a statistical approach to test whether a new landing page for an e-tailer would help increase user engagement on the platform. 2018-11-05 13:18:20.946000+00:00 Read the full story.  
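The post doesn’t spell out its statistics, but a standard way to answer its landing-page question is a two-proportion z-test on engagement rates between control and variant. A minimal sketch (the sample sizes and click counts below are made up for illustration):

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: does variant B engage users at a different
    rate than control A? Returns (z statistic, two-sided p-value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-CDF tail via the error function (stdlib only, no SciPy needed)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical run: old page 1,000/10,000 engaged, new page 1,120/10,000
z, p = two_proportion_ztest(1000, 10_000, 1120, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 2.76, p ≈ 0.006
```

With p below a 0.05 significance threshold, this toy experiment would reject the null hypothesis that the two pages engage users equally — which is exactly the kind of conclusion the “Analyzing the results” step is meant to produce.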

The Next Big Barrier Facing Artificial Intelligence: Common Sense

When armed with the right model, a machine-learning platform can rapidly improve its accuracy and success rate. Just look at Google’s A.I. tool that can detect the most common forms of lung cancer with 97 percent accuracy, or self-driving cars that travel for thousands of miles without so much as a fender-bender. With the incredible amount of resources and brainpower devoted to machine learning and A.I., it’s inevitable that these platforms will only get “smarter” in the years to come. However, there’s just one little problem: A.I. lacks what we call “common sense.” DARPA—that’s the agency of the Department of Defense (DoD) that researches and prototypes all kinds of crazy inventions—wants to build common sense into A.I., presumably so that future military robots don’t accidentally tumble off cliffs or into walls. DARPA’s Machine Common Sense (MCS) program will host a competition to come up with solutions. 2018-10-31 00:00:00 Read the full story.  

Microsoft CEO Satya Nadella: Tech companies need to defend privacy as a human right

Microsoft CEO Satya Nadella voiced his support for privacy as a “human right” and called on tech companies to protect users from cyber threats in a keynote address Thursday. Speaking at the Microsoft Future Decoded conference in London, Nadella said the tech industry and governments need to collectively consider the unintended consequences of every business becoming a digital company. He highlighted three major considerations: privacy, cybersecurity and AI (artificial intelligence) ethics. “All of us will have to think about the digital experiences we create to really treat privacy as a human right,” Nadella said. Nadella praised Europe’s new General Data Protection Regulation (GDPR) that sets stringent privacy standards for any company with business in the EU. Apple CEO Tim Cook also recently applauded the law, calling for similar federal privacy regulation in the U.S. “GDPR as a piece of legislation, a piece of regulation is a great start,” Nadella said. “We think about it as something that sets the standard, the bar, for how people need to think about privacy worldwide.” Tech CEOs like Nadella and Cook are getting increasingly vocal about their support for regulation amid ongoing data privacy concerns. In a speech in Brussels last week, Cook blasted tech companies saying personal information is being “weaponized against us with military efficiency.” 2018-11-01 00:00:00 Read the full story.  

Will Big Data Change The Future Of Franchises Forever?

Big data is having a profound impact on business models that have been around for decades. Franchises are no exception. Corey O’Donnell, the vice president of Yodle, addressed this in a column at Franchise Magazine. O’Donnell astutely points out that big data has given franchises the opportunity to optimize their business models much more effectively. “Using data to determine the best way forward is the difference between making a well-informed decision and taking a leap of faith. This applies to a franchise network as much as any other business. Armed with data, we have the information needed to understand the past, interpret the present and predict the future. If data-informed decisions are better decisions, and bigger is better, then logic says that “Big Data” enables us to make the best possible choices.” 2018-11-01 09:30:47+00:00 Read the full story.  

Using Machine Learning To Develop Wireframes For Your Mobile Apps

The term “big data” had already been coined back in 2010, when I was working in digital marketing and saw how strongly big data and deep learning would impact the mobile development profession. One of the biggest changes is the role of machine learning. Daryna P., a copywriter with RubyGarage, states that the applications of machine learning are virtually endless. Daryna alludes to a study from Salesforce showing that 57% of customers are willing to share their data with companies that plan to use it to make their experience better. However, collecting customer data won’t do you any good if you don’t put it into practice. You need to develop a well-thought-out wireframe and know what types of data to collect to implement it successfully. Here’s what to know about using machine learning to develop wireframes. 2018-11-02 21:16:06+00:00 Read the full story.  

Cray Unveils Shasta, Its First Exascale Supercomputer

Cray officials are unveiling the company’s first exascale-capable supercomputer, a system named Shasta that is designed to give organizations choices in compute and networking technologies and a single system to run such increasingly complex workloads as artificial intelligence, analytics, modeling and simulations. The supercomputer is aimed at simplifying computing for modern workloads that typically run on heterogeneous cluster systems, which Cray officials argue are becoming too complex. There is a growing demand for single systems to run disparate workloads and workflows at the same time, improving manageability, removing performance bottlenecks and enabling organizations to run these workloads at scale. 2018-10-30 00:00:00 Read the full story.  

Korean AI startup Skelter Labs lands strategic investment to expand to Southeast Asia

Korean AI startup Skelter Labs is expanding to Southeast Asia after it pulled in undisclosed funding from Singapore-based VC firm Golden Gate Ventures. Skelter Labs was founded in 2015 by Ted Cho, the former engineering site director at Google Korea. It started out developing apps and services that made use of AI but then pivoted to focus fully on AI tech, which it licenses out to companies and corporations that it works with. Now it is eyeing opportunities in Japan and parts of Southeast Asia — which has a cumulative population of over 600 million — with Vietnam, Thailand and Malaysia specifically mentioned. The startup raised a $9 million seed round earlier this year, and Golden Gate has added an additional check to that round, which came from KakaoBrain — the AI unit of Korean messaging giant Kakao — Kakao’s K-Cube venture arm, Stonebridge Ventures and Lotte Homeshopping, the TV and internet shopping business owned by multi-billion dollar retail giant Lotte. More specifically, Seoul-based Skelter Labs works on AI in the context of vision and speech, conversation, and context recognition, while it goes after customers in areas that include manufacturing, customer operations, device interaction, and consumer marketing. 2018-11-04 00:00:00 Read the full story.  

Freedom on the Net 2018: The Rise of Digital Authoritarianism

The internet is growing less free around the world, and democracy itself is withering under its influence. Disinformation and propaganda disseminated online have poisoned the public sphere. The unbridled collection of personal data has broken down traditional notions of privacy. And a cohort of countries is moving toward digital authoritarianism by embracing the Chinese model of extensive censorship and automated surveillance systems. As a result of these trends, global internet freedom declined for the eighth consecutive year in 2018. Events this year have confirmed that the internet can be used to disrupt democracies as surely as it can destabilize dictatorships. 2018-10-30 22:08:33-04:00 Read the full story.  

From fighting Alzheimer’s to AR captions, UW computer science students show cutting-edge innovations

As the University of Washington’s computer science program has grown, so too has the breadth of problems that its students are trying to solve. That variety was on full display Thursday evening on campus, as projects focused on healthcare, cloud computing, augmented and virtual reality and much more were honored. A project called Embarker, which focuses on identifying genes that can be used as markers that could predict Alzheimer’s Disease, took home the 13th annual Madrona Prize for the project with the most commercial potential at the Paul G. Allen School of Computer Science and Engineering’s Poster and Demo Session as part of the 2018 Industry Affiliates Annual Research Day Thursday night. 2018-11-02 19:00:16-07:00 Read the full story.  

It’s become increasingly clear that Alphabet, Google’s parent company, needs new leadership

When they walked out of their jobs by the thousands on Thursday, Google employees did more than bring attention to the company’s handling of sexual harassment allegations. They put an intense public spotlight on the acute shortcomings of the company’s top brass. Executives at Google, and its parent company Alphabet, didn’t just let gender discrimination fester at the tech giant. In some cases, they enabled it. And then, when employee rage over what they’d done sparked the largest protest ever seen at the company — and perhaps at any tech firm ever — top managers were nowhere to be seen. Google CEO Sundar Pichai was attending a tech conference in New York, on the other side of the country from the company’s headquarters, which represented the epicenter of the walkout. And Alphabet CEO Larry Page was … well, who knows? 2018-11-03 00:00:00 Read the full story.  

Interview: ‘The Formula’ by Albert-László Barabási explains the science behind who will succeed or fail

Do you find yourself struggling — whether in your startup, day job, or life — and wondering, why does it seem like I’m not getting anywhere? Maybe it’s not just you. Despite our American notion of hard work and bootstrapping, it turns out a lot more goes into determining our success than we think — like everyone else around us. In The Formula: The Universal Laws of Success, Albert-László Barabási tackles the question of who is most likely to rise to the top, and who won’t, by applying his work at Northeastern University’s Center for Complex Network Research, analyzing years of data across multiple fields. Barabási’s five universal laws for success know no bounds — they apply to every field — and can explain the success of everything and everyone, from Exploding Kittens to Einstein. Barabási even turned the data for success into “formulae we could use to predict future outcomes for ourselves, our colleagues, and even our professional rivals.” 2018-11-03 19:00:10-07:00 Read the full story.  

Business Does Not Need the Humanities — But Humans Do

Sometimes a simple story is all it takes to capture complex issues, or so it seems. Take this one. A few years ago, Facebook CEO Mark Zuckerberg lost a game of Scrabble to a friend’s teenage daughter. “Before they played a second game, he wrote a simple computer program that would look up his letters in the dictionary so that he could choose from all possible words,” wrote New Yorker reporter Evan Osnos. As the girl told it to Osnos, “During the game in which I was playing the program, everyone around us was taking sides: Team Human and Team Machine.” The anecdote was too delicious to ignore, seeming to capture all we (think we) know about Zuckerberg—his casual brilliance, his intense competitiveness, his hyper-rational faith in technology, and the polarizing effect of his compelling software. It went viral. The story was popular because it easily reads as an allegory: the hacker in chief determined to find a technical solution to every problem, even far more complex ones than Scrabble—fake news, polarization, alienation. “I found Zuckerberg straining, not always coherently, to grasp problems for which he was plainly unprepared,” Osnos concluded after speaking to Zuckerberg extensively about his role in shifting public discourse worldwide. “These are not technical puzzles to be cracked in the middle of the night but some of the subtlest aspects of human affairs, including the meaning of truth, the limits of free speech, and the origins of violence.” 2018-11-02 16:00:19+00:00 Read the full story.  
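Osnos doesn’t describe Zuckerberg’s actual program, but the idea — look up every word your rack of letters can form — takes only a few lines. A hypothetical sketch using a toy word list (any real program would load a full Scrabble dictionary instead):

```python
from collections import Counter

def playable_words(rack, dictionary):
    """Return all dictionary words that can be formed from the rack's letters,
    longest (highest-scoring-ish) first, alphabetical within each length."""
    rack_counts = Counter(rack.lower())

    def fits(word):
        # A word fits if the rack has at least as many of each letter it needs.
        return all(rack_counts[ch] >= n for ch, n in Counter(word).items())

    return sorted((w for w in dictionary if fits(w)),
                  key=lambda w: (-len(w), w))

toy_dictionary = {"rate", "tare", "tear", "ear", "art", "zebra"}
print(playable_words("aetrx", toy_dictionary))
# ['rate', 'tare', 'tear', 'art', 'ear'] — 'zebra' needs letters the rack lacks
```

The multiset comparison via `Counter` is what makes this honest Scrabble logic: each rack tile can only be used once per word.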
This news clip post is produced algorithmically based upon CloudQuant’s list of sites and focus items we find interesting. If you would like to add your blog or website to our search crawler, please email us. We welcome all contributors. This news clip and any CloudQuant comment is for information and illustrative purposes only. It is not, and should not be regarded as, “investment advice” or as a “recommendation” regarding a course of action. This information is provided with the understanding that CloudQuant is not acting in a fiduciary or advisory capacity under any contract with you, or any applicable law or regulation. You are responsible for making your own independent decision with respect to any course of action based on the content of this post.