Wireless charging technology may sound like high-tech magic, but it isn’t. The idea of charging a device without plugging it in has existed for over 120 years. The underlying technology, known as magnetic induction, dates back to between 1888 and 1906, when Nikola Tesla spent years working on the “Tesla Tower.” It worked on the principle of converting alternating current into high-voltage, high-frequency power. Although that prototype failed at outdoor scale, the same principle is used today, on a much smaller and quieter scale, to charge smartphones.
In the past decade, wireless charging has matured through the formation of standards groups like the Wireless Power Consortium and through adoption by mobile device makers such as Samsung in 2014 and Apple in 2017. Today’s wireless charging technology consists of direct-contact charging pads that use tightly coupled induction, which creates an electromagnetic field between two copper coils.
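To give a rough sense of the physics, the two coils in an inductive charger are typically tuned as an LC resonant circuit, whose operating frequency is f = 1 / (2π√(LC)). The sketch below is illustrative only; the coil inductance and capacitance values are hypothetical, chosen simply to land near the Qi standard’s operating band (roughly 87–205 kHz).

```python
import math

def resonant_frequency(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency of an LC tank circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical component values for a Qi-style charging pad.
coil_inductance = 24e-6      # 24 microhenries
tuning_capacitance = 100e-9  # 100 nanofarads

f = resonant_frequency(coil_inductance, tuning_capacitance)
print(f"Resonant frequency: {f / 1e3:.1f} kHz")  # ~103 kHz, within Qi's band
```

Tuning both coils to the same resonant frequency is what lets energy transfer efficiently across the small air gap between pad and phone.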
A study conducted by SurveyMonkey found that wireless charging was the most anticipated iPhone feature for 2017. Yet the same study reported that only 29% of users actually charged wirelessly that year. The reason? Despite the name, the technology isn’t truly wireless: you still need an accessory, such as a charging pad to set your phone on, and that pad must itself be attached to a power source. So what’s the point? Convenience. Charlie Quong, VP of product development at Mophie, said that placing wireless chargers in the areas where people spend the most time would remove the hassle of repeatedly plugging in their phones. This has fueled anticipation that wireless charging will become ubiquitous: WiTricity CTO Morris Kesler estimated that by 2025, more than 2 billion devices will ship with wireless charging each year.