Monthly Archives: December 2016

The new cognitive advertising: brand utility

Reading Time: < 1 minute

The fields of application of artificial intelligence are varied, and today they include advertising, thanks to experiments carried out by IBM through its Watson Ads division.

The group’s objective is to rely on its in-house artificial intelligence, Watson, to give birth to the first “cognitive advertisements”, able to answer questions formulated in natural language. Because we are never better served than by ourselves, IBM is testing the technology in a recently acquired editorial universe: that of the weather site Weather.com and its mobile application.

Based on the location of the device, the surrounding weather and the ingredients the user enters via microphone or keyboard, the advertising interface is able to offer, in real time, cooking recipes in partnership with the soup brand Campbell’s. “Chef Watson” searches the brand’s collection of recipes and dynamically displays the most appropriate one within the relevant advertising insert. “Using data to deliver quick, easy meal ideas in real time is exactly the kind of experience we want to offer our consumers,” says Campbell Soup’s media manager, Marci Raible. The whole challenge of the program is to move beyond the “top-down” brand message and offer the surfer an unprecedented experience.
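As a thought experiment, the selection step might look something like the sketch below. The recipe data and the scoring rule are invented for illustration; IBM has not published how Chef Watson actually ranks recipes.

```python
# Hypothetical sketch of a "cognitive ad" picking a recipe from the
# user's ingredients and the local weather. All data here is made up.
RECIPES = [
    {"name": "Tomato soup grilled-cheese dip",
     "ingredients": {"tomato soup", "bread", "cheese"}, "weather": "cold"},
    {"name": "Chilled gazpacho",
     "ingredients": {"tomato soup", "cucumber"}, "weather": "hot"},
    {"name": "Chicken noodle casserole",
     "ingredients": {"chicken soup", "pasta"}, "weather": "cold"},
]

def pick_recipe(user_ingredients, local_weather):
    """Return the recipe matching the most ingredients on hand,
    breaking ties in favour of the current weather."""
    def score(recipe):
        overlap = len(recipe["ingredients"] & user_ingredients)
        weather_bonus = 1 if recipe["weather"] == local_weather else 0
        return (overlap, weather_bonus)
    return max(RECIPES, key=score)

best = pick_recipe({"tomato soup", "bread", "cheese"}, "cold")
print(best["name"])  # Tomato soup grilled-cheese dip
```

The ad insert would then render the chosen recipe in place.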

References:

http://www.adweek.com/news/technology/ibm-watson-and-weather-company-are-ready-launch-their-first-cogntive-ads-173727

Driving a Tesla Car with Mind Control

Reading Time: 2 minutes

A Tesla Model S drove a few meters in a straight line between two spots in a parking garage. The interesting thing was that there was no driver behind the wheel: a person in the passenger seat wore an EEG headset that allowed him to control the car with his mind. Let me introduce the “Teslapathic” car to you.


A team of scientists based in California used a 2015 Tesla Model S 85D for a project presented at a hackers’ event. The team used EEG headsets; EEG, or electroencephalography, is an electrophysiological monitoring method that records the electrical activity of the brain. According to the team’s project description, the headset is placed on the driver’s head, and a machine learning program classifies the brain activity as “stop” or “go”; these commands are then sent as analog signals over an off-the-shelf RC radio to actuators on the pedals and a motor on the steering wheel.


The developers created special gestures for the project. For example, to start the engine and move the car, the passenger wearing the EEG headset taps his right foot; to stop the car, he clenches his left hand. In reality, it is the brain activity behind these movements that the headset picks up and that the system converts into commands sent electronically to the car.
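A minimal sketch of that mapping, assuming the classifier emits named events (the event names and the `Command` enum are my own; the real project wired the output to an RC radio driving pedal actuators):

```python
# Toy mapping from classified EEG/gesture events to driving commands.
from enum import Enum

class Command(Enum):
    GO = "go"
    STOP = "stop"
    NONE = "none"

def decode(eeg_event: str) -> Command:
    """Map a classified EEG event to a driving command."""
    if eeg_event == "right_foot_tap":    # imagined right-foot tap -> accelerate
        return Command.GO
    if eeg_event == "left_hand_clench":  # imagined left-hand clench -> brake
        return Command.STOP
    return Command.NONE                  # ignore anything unrecognised

events = ["right_foot_tap", "noise", "left_hand_clench"]
print([decode(e).value for e in events])  # ['go', 'none', 'stop']
```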

Moving and stopping are brain-controlled, but what about steering? Well, steering was somewhat bulky and unfortunately not brain-controlled. The team fitted a windshield-wiper motor with a potentiometer onto the steering wheel, and a head-mounted gyro gave the driver some steering ability: when he moved his head right or left, the steering wheel reacted accordingly.

The team also thought about the driver’s safety. While developing the mind-control tool, they built in an emergency brake in case of failure: the driver has to hold a dead man’s switch, and releasing it sends a signal that stops the car.
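That fail-safe can be sketched as a tiny watchdog; the timeout value and function name here are illustrative, not taken from the Teslapathic team:

```python
# Dead-man's-switch watchdog: the car keeps moving only while a
# "switch held" signal keeps arriving within a short timeout.
def should_brake(now: float, last_held: float, timeout: float = 0.5) -> bool:
    """Trigger the emergency brake if the switch signal is stale."""
    return (now - last_held) > timeout

assert not should_brake(now=10.0, last_held=9.8)  # signal fresh: keep driving
assert should_brake(now=10.0, last_held=9.0)      # signal lost: brake
```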

Maybe mind control will have a future in the automotive industry?

 

References:

  1. https://electrek.co/2016/11/16/tesla-model-s-mind-control/
  2. http://www.seeker.com/hackers-turn-tesla-into-a-brain-controlled-car-2105181698.html
  3. https://en.wikipedia.org/wiki/Electroencephalography

Welcome to Amazon Go!

Reading Time: < 1 minute

There we are, Amazon did it: the e-marketplace has just opened its first supermarket in Seattle (United States). Called Amazon Go, the store has the particularity of having no checkout; everything goes through a mobile app linked to an Amazon account. For now the store is only being tested and is open solely to Amazon’s employees; it should open to the public at the beginning of the year. Moreover, if the tests are successful, Amazon plans to open more than 2,000 grocery stores, the Wall Street Journal reported on Monday, citing sources.

To use Amazon Go, shoppers need to install the application and scan the barcode displayed on the screen before entering. They can then put their smartphone away and simply pick up the items that interest them. Once their shopping is done, they walk straight out of the shop without having to open their bag: the exit is detected and, as the user passes through it, without having to stop or otherwise be delayed, the invoice is sent to their smartphone and payment is made automatically.
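A toy model of that flow, with invented prices and item names (Amazon has not published how its virtual cart actually works), might look like this:

```python
# Illustrative model of the "Just Walk Out" flow: scan in, sensors
# maintain a virtual cart, and walking out charges the linked account.
class GoSession:
    def __init__(self, account):
        self.account = account
        self.cart = {}  # item -> quantity, maintained by the store's sensors

    def pick_up(self, item):
        self.cart[item] = self.cart.get(item, 0) + 1

    def put_back(self, item):
        if self.cart.get(item, 0) > 0:
            self.cart[item] -= 1

    def walk_out(self, prices):
        """Leaving the store triggers the charge; no checkout line."""
        total = sum(prices[i] * q for i, q in self.cart.items())
        return f"Charged {self.account}: ${total:.2f}"

s = GoSession("alice@amazon")
s.pick_up("sandwich"); s.pick_up("soda"); s.put_back("soda")
print(s.walk_out({"sandwich": 4.99, "soda": 1.50}))  # Charged alice@amazon: $4.99
```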

Amazon calls this “Just Walk Out technology” but remains rather vague about how the store works: they explain only that they are using computer vision, deep learning algorithms and sensor fusion, technologies that can also be found in self-driving cars. More details will be given next year with the opening of their stores.

Stay tuned!

 

References:

http://www.geekwire.com/2016/amazon-go-works-technology-behind-online-retailers-groundbreaking-new-grocery-store/

http://www.news18.com/news/tech/amazon-challenges-supermarkets-opens-line-free-grocery-store-1319789.html


Have you ever dreamed of selling your home directly from your couch? Now you can!

Reading Time: 4 minutes

Opendoor is a startup which was founded in San Francisco in 2014. This company has a simple business model – it buys houses in cash, makes some minor repairs, and then resells them at a higher price.


This model is unusual in Silicon Valley, where real estate businesses have traditionally focused on creating software to connect homeowners and buyers, not buying and selling buildings. Opendoor takes on risks associated with real estate ownership, hoping to turn a profit on each sale. For sellers, Opendoor offers a quick way to cash out.
The company currently purchases homes only in Las Vegas, Phoenix, and Dallas-Fort Worth, according to its website, but it says it’s using the 210 million dollars (£168 million) it just raised to expand to 10 cities next year.
There is another important fact to consider: two of these areas (Phoenix and Las Vegas) were among those most affected by the 2007 US property bubble, and in my opinion it is no coincidence that the company is operating successfully there.

Cofounder and CEO Eric Wu’s LinkedIn profile says: “Our mission is to make residential real estate liquid, changing the traditional sales process by making it simple to buy and sell real estate online.” So the company’s mission is to simplify real estate.
Its main purpose is to take the pain out of selling, which shouldn’t be this hard:

  • strangers walking through your home;
  • months of cleanings and showings;
  • contracts falling through.

Nevertheless, it still takes months to sell a home. That is why the company has settled on this mantra: “With Opendoor, your home is sold the minute you’re ready”.


Furthermore, you can read on the company’s website: “We believe in a better home selling and buying experience”.
So they focus on a better experience and less stress in the process of buying and selling houses. They want to change the traditional business model of this world by offering speed and customer satisfaction.

Let’s learn something else about this company:

  1. How do they earn money?

    Opendoor relies on a complex algorithm to bid for homes sight unseen, and it can then close on those deals in as little as three days. It makes its money by taking a service fee of 6%, similar to the standard real estate commission, plus an additional fee that varies with its assessment of the riskiness of the transaction and brings the total charge to an average of 8%. It then makes the fixes recommended by inspectors and tries to resell the homes at a small premium.

  2. How does the service offered by the company work?

    1. You tell the company about your home.
    Requesting an offer takes just a few minutes and is completely free.

    2. Receive an offer on your home.
    The company bases its offer on a comparative market analysis and your home’s unique story.

    3. Schedule your home inspection.
    After you accept Opendoor’s offer, the start-up will do a free, quick home inspection to confirm your home’s condition.

    4. Choose your move-out date.
    Select the closing date you want, from 3 to 60 days from today.

    5. The company will take care of the rest.
    Watch the progress of your sale while Opendoor handles all of the details, including arranging escrow.

    6. And you’re done!
    You will receive the proceeds from your home’s sale on your chosen closing date.

Furthermore, Opendoor hopes to give buyers even more peace of mind with its new policies. The company will offer a 30-day money-back guarantee to all buyers, for any reason. If they do want to give the house back, so to speak, Opendoor will repurchase it at the price it sold it for, minus the fees associated with the sale, including any escrow fees, title insurance, homeowners association fees, and commissions. Buyers who took out a bank loan to purchase the house will have to pay the loan back to release the house’s title to Opendoor, the company said.
So this kind of warranty could be an important source of competitive advantage over its competitors.

But however successful a company is, there are always risks. A potential problem with this business model is that it rests on the assumption that both rents and property values will continue to rise, which is not an unreasonable assumption over the long term, but what if they stop?

In conclusion, the business model is certainly simple and brilliant. It could help many people get a large amount of money quickly from the sale of their house, but some might see the company as a hawk taking advantage of people who really need money because they are going through a bad period in their lives.
In the end, my hope is that this does not end like the last financial crisis; I hope that this company, with its simple model, proves more efficient and more transparent, so as to avoid the problems connected with the banks’ business model in the real estate world.

 

Sources:

https://www.opendoor.com

http://www.forbes.com/sites/amyfeldman/2016/11/30/next-billion-dollar-startup-opendoor-raises-another-210-million-to-expand-its-homebuying-model/#2d556cad2521

http://fortune.com/2016/06/07/opendoor-money-back-gurantee/

http://www.businessinsider.com/startup-opendoor-valued-at-more-than-1-billion-2016-12?IR=T

https://www.bloomberg.com/news/articles/2016-11-30/opendoor-s-home-flipping-business-becomes-the-latest-unicorn-startup
Chronocam – The camera inspired by the human eye

Reading Time: < 1 minute

Created in 2014, Chronocam boasts genuine expertise in artificial vision. It took its inspiration from the functioning of the retina and the human brain to develop an unprecedented approach: its sensors work like an eye rather than like a traditional camera. Instead of taking a series of full pictures, Chronocam sensors record the changes of each pixel independently of the others, and only non-static data is saved for processing. The sensors can thus capture scenes at very high speed (the equivalent of 100,000 fps), even under difficult light conditions, with little information lost to compression.
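A frame-differencing sketch conveys the idea, although a real Chronocam sensor detects changes asynchronously in hardware rather than by comparing frames; the threshold and frame data below are invented:

```python
# Event-based idea: instead of storing whole frames, record only the
# pixels whose brightness changed beyond a threshold.
def events_between(prev_frame, next_frame, threshold=10):
    """Return (row, col, delta) for every pixel that changed."""
    events = []
    for r, (prev_row, next_row) in enumerate(zip(prev_frame, next_frame)):
        for c, (p, n) in enumerate(zip(prev_row, next_row)):
            if abs(n - p) >= threshold:
                events.append((r, c, n - p))
    return events

frame_a = [[100, 100, 100],
           [100, 100, 100]]
frame_b = [[100, 180, 100],   # one pixel brightened...
           [100, 100, 100]]
print(events_between(frame_a, frame_b))  # [(0, 1, 80)] -- 1 event, not 6 pixels
```

A static scene produces no data at all, which is where the power and bandwidth savings come from.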

Numerous sectors, starting with health, could find an interest in it: more efficient in both energy and bandwidth, Chronocam’s solution is also a product of choice for connected objects and even autonomous cars.

The cameras will have many applications: they could be used in any machine that needs to “see”, such as smart mobile devices, gesture-controlled devices, self-driving cars, robots, drones, or vision-restoration devices for the blind.

As said before, these cameras improve on traditional ones because they consume less power and produce less data, which requires much less bandwidth to transmit and much less space to store. With less data to process, smart machines can also make quicker decisions, which is especially important for self-driving cars that may need to make life-or-death decisions in milliseconds. To sum up, these cameras will make devices smarter.

Sources:

https://www.maddyness.com/finance/2016/10/25/chronocam-leve-15-millions-vision-artificielle/

http://www.chronocam.com/about-us/

http://sciencebusiness.net/news/80014/Chronocam-start-up-aims-to-help-machines-see-like-humans

Control your devices by snapping your fingers: the future of home automation?

Reading Time: < 1 minute

No more need to get up or tap on a smartphone: lifting a little finger is now enough to control some connected objects.

Start-ups like Bluemint Labs or Thalmic Labs have developed devices capable of detecting movements and, for example, transmitting commands to shutters, smart lamps and other home automation devices.

The Canadian startup Thalmic Labs, created in 2012 and based in the province of Ontario, developed the connected armband Myo. It can identify movements of its wearer’s arm and even fingers. To do this, it combines spatial data measured by a gyroscope and an accelerometer with the electrical activity of the muscles. This smart armband, which looks a bit like the track of a military tank, is worn at the top of the forearm near the elbow. From the recorded signals, it lets you control a smartphone with a simple gesture (making a phone call with a left-to-right movement while in a car, for example).
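A rough, rule-based sketch of how such an armband might fuse its sensors; the thresholds, channel values and gesture names are assumptions, and Myo’s real classifier is far more sophisticated:

```python
# Naive sensor fusion: EMG channels detect a muscle gesture, while the
# gyroscope's yaw rate (degrees per second) detects an arm sweep.
def classify(emg_channels, gyro_yaw_dps):
    """Very naive rule-based gesture/motion classifier."""
    if max(emg_channels) > 0.8:   # strong muscle activation -> fist
        return "fist"
    if gyro_yaw_dps > 90:         # fast left-to-right arm sweep
        return "swipe_right"
    if gyro_yaw_dps < -90:        # fast right-to-left arm sweep
        return "swipe_left"
    return "rest"

print(classify([0.1, 0.9, 0.2], 0))    # fist
print(classify([0.1, 0.2, 0.1], 120))  # swipe_right
```

Each recognized gesture would then be bound to a command such as answering a call.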

 

References:

https://www.myo.com

MYO Armband provides hands-free controls by reading your arm muscles

A new memory for your digital life

Reading Time: 2 minutes

Atlas Informatics is a startup in Seattle with an audacious goal to redefine search as we know it. Atlas Recall, the company’s first product, gives you a searchable photographic memory that helps you find everything you have seen across all of your devices, apps and cloud services.

This app remembers everything you do on your computer, or on all of your computers, and automatically creates a searchable database that can peer into your browser history, your text messages and emails, and just about every major app you can think of, including the Microsoft suite, Slack, most Adobe apps, Google Docs and Drive, Evernote, Dropbox, and Twitter.

If you are thinking that you’ve heard of this before, you’re most likely thinking of universal search on Apple TV, Spotlight on the Mac and iOS, or Google. But remember that each of these is siloed: Google searches the public internet, Facebook tracks your private photos and friends, Outlook has your contacts, emails and appointments, Spotlight knows your local files, Spotify has your music and playlists, and so on.

Atlas Recall is more like an amalgamation of these different services. While Google can search the indexed web and information from accounts you’ve signed into, it can’t look at documents stored locally on your laptop or iPhone. And although Spotlight and Universal Search trawl your apps, files and even the internet, they can’t pull up a page from your browsing history or make associations with other things you were looking at. Atlas Recall is unique in its ability to sort your results by what else was happening at the same time.

With this app you can also look for something based on the time you opened it or on what you were doing when you saw it. Search results are laid out visually, with screenshots of each listing, organized by file type (images, documents, web pages, etc.).
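A toy version of such an index makes the idea concrete; the record schema, timestamps and query function below are invented, since Atlas has not published its data model:

```python
# Tiny cross-device activity index: every "seen" item is stored with a
# timestamp so results can be filtered by text and by time window.
records = [
    {"t": 100, "type": "web",  "device": "mac",    "text": "tesla mind control article"},
    {"t": 105, "type": "doc",  "device": "mac",    "text": "project notes on EEG"},
    {"t": 900, "type": "mail", "device": "iphone", "text": "lunch plans"},
]

def recall(query, around_t=None, window=60):
    """Full-text hits, optionally limited to what else was open near a time."""
    hits = [r for r in records if query in r["text"]]
    if around_t is not None:  # "what was I doing at the same time?"
        hits = [r for r in hits if abs(r["t"] - around_t) <= window]
    return hits

print([r["type"] for r in recall("EEG")])  # ['doc']
print(len(recall("lunch", around_t=100)))  # 0 -- outside the time window
```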

Recall is probably most useful for someone with more than one computer and more than one email address, as well as a job that keeps them on the go a lot. The handiest feature is the ability to search your computer from an iPhone: if you’re away from your computer and suddenly need to send someone everything you’ve written, emailed, and searched on a given subject, that’s easy to do.

Obviously there are some things you do on a computer that you might not want stored at all. For that reason, you have the option to pause Atlas for various increments of time and rescind its ability to remember what you’re doing.

You can also delete data after the fact, and Atlas promises you have the sole ownership and control of all your information.

Right now Atlas Recall is only available as an open beta on the Mac and iOS; a Windows 10 version will be available soon.

References:
https://www.atlas.co/using-atlas/

http://www.theverge.com/2016/11/30/13779488/atlas-recall-digital-history-app-hands-on-impressions

https://www.engadget.com/2016/11/02/atlas-recall-is-a-cross-platform-search-with-a-big-caveat/

https://techcrunch.com/2016/11/02/atlas-recall-a-search-engine-for-your-entire-digital-life-gets-an-open-beta-and-20m-in-backing/

 

Lost in translation no more? Google’s AI invents own language

Reading Time: 2 minutes

Google Translate has, in all of its ten years of existence, been a quite useful tool: appreciated by some (especially those who know about the difficulty of machine-based translation), ridiculed by others, used by most. It has never really lived up to professional standards, though, and, to take that upfront, that hasn’t changed since you last used it half an hour ago. But Google is about to use its main advantage: its data, its 140 billion translated words per day.

In order to do that, the ‘machine’ had to be made adaptive. One main obstacle on that road was that, with 103 supported languages, the idea of creating comprehensive language pairs (meaning direct translation from each language to every other supported one) was close to impossible. Translation would usually go via English, and that hampered any real progress: even if you managed to improve a result, what you improved was not the link between the two target languages but the link between either or both of them and English. As soon as you took English out of the equation, the progress would probably be lost. Translation is a delicate field if you aim for good results.
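The combinatorics make the problem obvious; a two-line calculation shows why dedicated per-pair models don’t scale while pivoting through English does:

```python
# With 103 supported languages, count the translation models needed.
n = 103
direct_pairs = n * (n - 1)  # a dedicated model for every directed pair
via_english = 2 * (n - 1)   # each language <-> English only
print(direct_pairs)         # 10506
print(via_english)          # 204
```

Hence the appeal of a single shared representation that covers all pairs at once.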

What you want to do instead, according to American and German researchers, is create a so-called neural network with a new common representation that can serve as a link between languages the machine has not been trained to translate directly. If it was able to translate between Hindi and Hebrew, and between Hebrew and French, then it can now also translate between Hindi and French without any middle step. Another reason why neural network systems have been hailed as the new star in the machine translation universe is that they take contextual meaning into account. Before (and still to an extent, because all machine-based translation by definition has to be statistical if it doesn’t want to be random), a sentence was translated by translating the single words or idioms individually and then putting them together. This is how things like this happen:

[Image: an example of a garbled word-for-word machine translation]

 

This brings us to the main point: Google’s new common language (or any neural network language, for that matter), unlike English, can be changed, improved and adjusted with the help of users’ requests.

This tempts some to conclude that, for one of the first times, AI, artificial intelligence, is actually at play in a user-facing field. But that’s a topic for another day…
