Author Archives: Mukhibbullo Ganiev

AuraRing – tracks your finger’s location

Reading Time: 2 minutes

Although camera-based hand tracking is becoming a viable option for AR and VR headsets, the accuracy of optical tracking remains an unsolved problem. That is where AuraRing comes in. AuraRing, a new electromagnetic tracking system developed by researchers at the University of Washington, combines high-resolution sensing with low power consumption to support AR, VR, and wider wearable uses.

AuraRing consists of two pieces. The first is an index-finger ring with a wire coil wrapped 800 times around a 3D-printed core; powered by a tiny battery, the coil produces an oscillating magnetic field while consuming just 2.3 milliwatts. Three sensor coils in a wristband then determine the ring’s position and orientation in five degrees of freedom (DoF) at any given time. The system’s resolution is 0.1 mm and its dynamic accuracy is 4.4 mm, far better than what external camera tracking of the same finger achieves.
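The underlying physics can be sketched in a few lines: a small coil behaves as a magnetic dipole whose field strength falls off with the cube of distance, so a sensor reading can be inverted into a range estimate. This is only an illustration of the principle; the dipole moment and geometry below are arbitrary assumptions, and the real AuraRing fuses readings from three sensor coils to recover the full 5-DoF pose.

```python
import math

# Toy illustration of electromagnetic ranging, the principle behind
# trackers like AuraRing. The ring's coil acts as a magnetic dipole:
# its on-axis field is B = mu0 * m / (2 * pi * r^3), so a measured
# field strength can be inverted to a distance. The dipole moment
# (0.01 A*m^2) is an arbitrary illustrative value, not the device's.
MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def dipole_field(moment: float, r_m: float) -> float:
    """On-axis field magnitude (tesla) of a dipole at distance r (meters)."""
    return MU0 * moment / (2 * math.pi * r_m ** 3)

def range_from_field(moment: float, b_t: float) -> float:
    """Invert the on-axis dipole law to estimate distance from a reading."""
    return (MU0 * moment / (2 * math.pi * b_t)) ** (1 / 3)

# Simulate a ring 8 cm from one wristband sensor and recover the range.
b = dipole_field(moment=0.01, r_m=0.08)
print(f"estimated range: {range_from_field(0.01, b) * 1000:.1f} mm")  # 80.0 mm
```

In practice the coil’s orientation also scales the reading, which is why three sensor coils and an oscillating (rather than static) field are needed to separate position from orientation and from background magnetic fields.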

With this sensing fidelity, a finger can write legibly in the air without a touchpad, and taps and flick gestures can control a screen from a distance. Because the system relies on magnetic rather than optical sensing, the researchers suggest that even a visually occluded finger can type text messages, interact with device user interfaces, and play games. AuraRing is also designed to work across a range of finger and hand sizes.

While the researchers’ illustration shows a conceptual wristband, the idea is that the wrist sensors could be built into a smartwatch or another wrist-worn device, letting consumers add precise finger sensing where appropriate. “If you want the additional benefits, you just put on the ring, and you still have all of the capabilities of today’s smartwatches,” explained co-lead investigator Farshid Salemi Parizi.

A video of AuraRing in action shows how it can be used with computer or virtual smartphone displays for pose reconstruction, handwriting recognition, and input selection. The research was funded by the University of Washington Reality Lab, Twitter, Futurewei, and Google. It is worth noting that Microsoft, headquartered in Redmond, Washington, is exploring a wider range of sensors for similar purposes in its own responsive smart ring.






Uber is bringing its self-driving cars to Washington, DC

Reading Time: 2 minutes

Uber’s self-driving cars will soon be on the streets of Washington, DC, the company has announced, where they will gather data to support the growth of its autonomous vehicle fleet. The cars will not be driving themselves, however. Instead, human drivers will operate them at launch to collect mapping data and capture driving scenarios that Uber’s engineers can replicate in simulation.

The company hopes its cars will eventually drive themselves in the city. Its Advanced Technologies Group said in a post on Medium, “We expect that this first hand-driven collection of data lays the groundwork for our car self-driving trials in Washington, DC.” “While we are excited about the prospects, we remain committed to making sure that every mile driven on public roads contributes to safe and effective learning.”

Uber has conducted its self-driving experiments with plenty of caution since the fatal crash in Tempe, Arizona, in March 2018. Elaine Herzberg, 49, was struck and killed while walking her bike across the street by one of the company’s cars, which had only a safety driver behind the wheel.

Police later said the safety driver was not watching the road but was streaming The Voice on her phone when the crash happened. In a blistering official report that also blamed the federal government for failing to regulate the industry adequately, the National Transportation Safety Board divided responsibility among Uber, the safety driver, the victim, and the state of Arizona. Local prosecutors cleared the company of criminal wrongdoing, and Uber reached an undisclosed settlement with Herzberg’s estate.

Testing officially resumed nine months after the crash, with Uber’s Advanced Technologies Group, headquartered in Pittsburgh, running its Volvo SUVs on a closed loop in the city’s center. Uber also collects data in three other major markets: San Francisco, Houston, and Toronto, although Pittsburgh remains the only city where its cars drive autonomously. The company recently introduced its third-generation vehicle, to be tested this year.

Uber is not the only organization operating self-driving cars in the nation’s capital. Argo AI, the AV startup backed by Ford and Volkswagen, has been testing its cars in DC since 2018, and Boston-based Optimus Ride operates a small fleet of autonomous shuttles in northern Virginia.



An electric car from Sony may soon be on its way

Reading Time: 2 minutes

For smartphone makers, the next big step over the last several years has been entering the TV market: Xiaomi has done it, and so have Apple, HTC, and OnePlus. Sony, however, shocked everybody by taking a different turn and introducing its first electric concept car. The announcement was made at CES 2020 in Las Vegas.

The Sony Vision-S (not the car’s official name but that of its development platform) is an electric sedan with 33 sensors inside and outside the vehicle, along with multiple wide displays and a 360-degree audio system.

According to a report from CNET, the car was built by Sony’s AI and robotics division. Its safety system is called the “Safety Cocoon,” and the vehicle has been fully tested on the track. Sony said the road tests were carried out to make sure the car complies with all safety regulations before it was exhibited.

The Sony prototype is also equipped with Time-of-Flight sensors that can detect and recognize people and objects inside and outside the car. The car offers so-called Level 2 autonomy: the software can accelerate, brake, and steer on its own, but drivers must remain ready to supervise and take over if the system fails.

According to several sources, Sony has provided very little additional information about the vehicle, but some specs make it clear that it is a four-seater with two 200 kW motors. According to an Engadget report, the car can go from 0 to 60 mph in under 5 seconds and has a top speed of 149 mph. Note that there is no word on whether Sony will formally ship or sell this vehicle; it is still a prototype, a concept car showcasing Sony’s various technologies. That said, you never know: in the not-so-distant future, a Sony car may be on the roads.
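As a sanity check, those figures hang together on the back of an envelope. The 2,350 kg curb weight below is an assumed value typical of a large electric sedan, since Sony has not published one:

```python
# Rough plausibility check of the reported Vision-S specs. The curb
# weight (2,350 kg) is an assumption, not a figure from Sony.
def kw_to_hp(kw: float) -> float:
    """Convert kilowatts to mechanical horsepower."""
    return kw * 1.34102

def ideal_zero_to_sixty_s(power_kw: float, mass_kg: float) -> float:
    """Power-limited lower bound on the 0-60 mph time, ignoring traction
    limits, drivetrain losses, and drag: t = kinetic energy / peak power."""
    v = 26.82  # 60 mph in m/s
    return (0.5 * mass_kg * v ** 2) / (power_kw * 1000)

total_kw = 2 * 200  # two 200 kW motors
print(f"total output: {kw_to_hp(total_kw):.0f} hp")          # 536 hp
print(f"idealized 0-60 mph: {ideal_zero_to_sixty_s(total_kw, 2350):.1f} s")
```

The idealized figure of roughly 2.1 seconds is a hard lower bound; once traction and drivetrain losses are factored in, the reported sub-5-second time is entirely plausible for about 536 hp.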





AI in the past decade and in 2020

Reading Time: 2 minutes

It is easy to see that AI will be the overarching theme of how the world changes in the 2020s. Nevertheless, the technology is still in its infancy, and between 2010 and 2020 several industries failed to live up to the promise and the speculation. We take future trends seriously, and a great deal of the writing on this topic misses the mark.

Trends in AI and in digitalization more broadly are both worth watching, along with the wider technical developments that sit between the two.

The last decade still produced plenty of hyped innovations and devices, such as

  • Virtual Reality (VR)
  • Chat Bots
  • Augmented Reality
  • Blockchain
  • IoT
  • Quantum Computing

that did not fully deliver in real terms.

The 2010s were the decade of mobile, and its rapid deployment is the model by which we might expect brain-computer interfaces (BCIs) to reach mass adoption in the coming decade. The push into automated shops and electric vehicles is a real trend for the next decade, but the speculation must be separated from the reality.

Autonomous vehicles, practical quantum computing, and better self-learning AI are still waiting in the wings, while digital currencies are being adopted exponentially more rapidly.

A few years can bring a lot of change, from computers to the internet to smartphones, and science never stands idle. Surveillance capitalism has pushed standardization and security to the fore, and now comes the AI arms race.

Most technology-trend and AI listicles only scratch the surface of how deeply humans are embedding technology into their lives. A more complete view draws on a variety of industries and cuts through the stacks of technology and innovation.

The real world and the customer experience are the ultimate tests of new technologies. 3D printing, quantum computing, and AGI will take several decades to develop, but an age of biotechnology and AI is imminent in wellness, education, and finance.

In terms of how technology is changing our lives, it is an exciting and frightening time to be alive, and the 2020s may well be the most transformative decade of the 21st century.

The 2020s will separate the dreamers and pretenders from the true innovators, and we will be dealing with both at every turn.





Reading Time: 2 minutes

Any failures or unplanned downtime can be costly at the vast Pernis refinery in Rotterdam, where Royal Dutch Shell processes 20m tons of crude oil per year. At Europe’s largest refinery, the machinery and operating conditions are tracked using 50,000 sensors that generate 100,000 measurements per minute.

Last year, Shell began using machine learning to better analyze and act on that data. The model was designed to predict control valve failures, enabling workers to perform maintenance or adjust operating conditions as required.
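The general shape of such an early-warning model can be illustrated with a simple drift detector: keep a rolling baseline of recent sensor readings and raise an alert when a new reading deviates sharply from it. This is a minimal sketch of the idea, not Shell’s actual model; the window size and z-score threshold are illustrative assumptions.

```python
# Minimal sketch of sensor-based early-warning logic of the kind the
# article describes: flag a control valve whose readings drift away
# from their recent baseline before an outright failure. The window
# size and threshold are illustrative, not Shell's actual parameters.
from collections import deque
from statistics import mean, stdev

class DriftDetector:
    def __init__(self, window: int = 20, threshold: float = 3.0):
        self.baseline = deque(maxlen=window)  # recent "healthy" readings
        self.threshold = threshold            # alert at |z-score| > threshold

    def update(self, reading: float) -> bool:
        """Return True if the new reading deviates sharply from baseline."""
        if len(self.baseline) == self.baseline.maxlen:
            mu, sigma = mean(self.baseline), stdev(self.baseline)
            if sigma > 0 and abs(reading - mu) / sigma > self.threshold:
                return True  # anomaly: do not absorb it into the baseline
        self.baseline.append(reading)
        return False

detector = DriftDetector()
# A stable periodic signal followed by a sudden jump at index 40.
readings = [50.0 + 0.1 * (i % 5) for i in range(40)] + [58.0]
alerts = [i for i, r in enumerate(readings) if detector.update(r)]
print(alerts)  # [40]
```

A production system would of course learn failure signatures from labeled history across thousands of sensors rather than use a fixed threshold, but the workflow is the same: detect the deviation early enough that maintenance can be scheduled before the failure.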


The work at Pernis is an example of how oil and gas companies are using artificial intelligence and machine learning to detect problems before they happen, often months in advance. Shell has since extended the program to 19 assets.

“We have stopped two trips with benefits of about $2 m in just the first two weeks since implementing this,” says Alexander Boekhorst, Royal Dutch Shell’s vice president for digitalization and computational research.

Royal Dutch Shell invests heavily in artificial intelligence research and development, which it hopes will address some of its most pressing challenges. From meeting the demands of a transitioning energy market, urgently in need of cleaner and more efficient power, to improving safety on the forecourts of its service stations, AI is at the top of the agenda.

Current programs include applying reinforcement learning in its exploration and drilling operations to reduce the cost of extracting the gas that still drives a significant proportion of its revenue.

Speaking during the development of the data strategy, Daniel Jeavons, Shell’s general manager for data science, said: “What it means in practice is that we as a data science team are in a great position because we can make our current business more effective, more efficient, more reliable, safer – by applying AI into those settings. But we can also play a role in creating some of the new business models that we want to create, and that’s really exciting because we’re playing our part in taking Shell into the next generation of energy sources, new fuels, and new sources of revenue.”

Elsewhere in its global business, Shell is rolling out AI at its public electric car charging stations to manage the shifting demand for power throughout the day. It has also installed computer-vision-enabled cameras at service stations that can detect customers lighting cigarettes, a serious hazard on the forecourt.

In the short term, Shell will use machine learning for predictive maintenance and output optimization to reduce its cost of goods sold. By gathering sensor data from equipment in the field, Shell can predict when a piece of equipment will malfunction and repair it before it fails. Preventing unplanned downtime has cut costs and, since implementation, has saved around 3.5 million barrels of production that would otherwise have been lost. All of the above uses of machine learning are enabled by an abundance of data, however, and the question of whether machine learning can be leveraged to catch the next high-risk failure with less relevant data remains unanswered.