Founded in 2012, Austin-based Mythic is helping realize this vision by pushing local AI beyond today’s edge to transform every device into an intelligent, trusted personal assistant. The inspiration for Mythic’s ingenious technology dates back to that year, when Mike Henry and Dave Fick, the company’s CEO and CTO respectively, graduated with doctorates in computer science. The friends bet that massive compute resources embedded in everyday appliances and devices would enable the next generation of machine learning (ML) and deep-neural-network approaches to software development. Back then, nobody called it ‘AI,’ but they soon realized that deep learning was the key to infusing human-like capabilities into any device, allowing machines to interact, hear, and see intelligently.
After years of hard work and innovation, the CEO believes the bet has finally started to pay off. In its early years, Mythic, then known as Isocline, focused on building embedded chips that let surveillance drones run software modeled after the human brain. In doing so, Henry and his team realized that many such applications and gadgets consume considerable bandwidth sending information to the cloud. For instance, a drone may capture thousands of high-resolution images to pinpoint a crack in a turbine or a dry patch on farmland; if all that data is shipped to the cloud and run through advanced algorithms there, the drone’s battery drains faster, forcing it to land even before the task is accomplished. This is precisely where Mythic saw a big opportunity: processing data locally, on the device.
Mythic’s AI system relies on things like summing and combining electrical currents, rather than classical methods of computation such as processors and memory
Mythic’s hardware and software platform frees local AI from dependence on the cloud and from the typical on-device technologies that more often than not limit sophistication. Where other platforms on the market fall short of project goals because of high energy consumption, Mythic’s platform delivers the power of local AI with minimal battery drain. Should an opportunity arise, Mythic pushes boundaries to get there first.
Unleashing the Potential of AI on Every Device
A unique aspect of Mythic’s platform is that, unlike conventional local AI, it is not weighed down by massive compute requirements: it performs hybrid analog/digital calculations inside flash arrays. This novel approach executes the inference step of a deep neural network within the same memory array that stores the model’s weights long term, boosting performance, accuracy, and battery life. The platform compromises neither battery life nor throughput while bringing local AI to an appliance.
Mythic’s platform delivers the power of a desktop GPU in a button-sized chip that efficiently supports neural networks with millions of weights. Leveraging rapidly scalable technology, it provides immense parallel computing exactly when intelligence is needed. The deep-learning chip runs neural network algorithms at a radically lower price and consumes less power than other computing chips on the market. With it, one can build machine intelligence into a security camera, a dishwasher, a toaster: appliances no one would previously have considered viable without a consistent internet connection. By turning devices into smart assistants that need no cloud connection, Mythic’s platform also makes them inherently more secure.
“Mythic’s AI system,” Henry says, “relies on things like summing and combining electrical currents, rather than classical methods of computation such as processors and memory that stall out for AI because of fundamental bottlenecks in moving data between the processor and memory, and also by the end of Moore’s law scaling.” By eliminating separate processors and computing inside the memory itself, Mythic can keep scaling the intelligence and power of its machine-learning chips at an exponential rate, growth that physical limitations had previously stalled. “Doing this was a significant undertaking and required leaving the comfortable world of binary 1’s and 0’s,” Henry adds.
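The principle Henry describes—computing by summing electrical currents—can be illustrated numerically. The following is a minimal sketch, not Mythic’s actual design: in analog in-memory computing generally, weights are stored as cell conductances, inputs are applied as row voltages, each cell passes a current proportional to their product (Ohm’s law), and the column wire sums those currents (Kirchhoff’s current law), so an entire dot product happens in one physical step with no data movement.

```python
import numpy as np

# Hypothetical illustration of analog in-memory multiply-accumulate
# (the sizes and values here are arbitrary assumptions).
rng = np.random.default_rng(0)

weights = rng.uniform(0.0, 1.0, size=(4, 3))  # conductances G stored in flash cells
inputs = rng.uniform(0.0, 1.0, size=4)        # voltages V applied on the rows

# Ohm's law: each cell passes current I = G * V.
cell_currents = weights * inputs[:, None]

# Kirchhoff's current law: currents on each column wire sum automatically,
# so every column computes one full dot product "for free".
column_currents = cell_currents.sum(axis=0)

# The physics yields the same result as a digital matrix-vector product.
assert np.allclose(column_currents, inputs @ weights)
```

The appeal is that the summation costs no compute cycles at all; it is simply what the wire does.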
Unlike Nvidia, which primarily focuses on server chips used for training, Mythic emphasizes embedded inference. According to Fick, inference is the more pressing problem. “The capabilities of the inference platform dictate your ability to deploy algorithms in the field. You could just double the size of your server farm, but that doesn’t really affect anything happening out in the field,” he says. As such, the company is on its way to sampling chips built on an ambitious, competitive architecture that leverages analog computing within flash memory cells to accelerate and enhance machine-learning tasks.
Fick notes that the processor will demonstrate unprecedented efficiency and enable security cameras to run facial recognition without help from the cloud. That, in turn, strengthens privacy, conserves network bandwidth, and pays off with lower latency for industrial controls and autonomous cars.
Mythic is heavily invested in processing inside blocks of flash memory, running ML faster and more efficiently than other chips on the market. “That in-memory processing eliminates the high power consumed by traditional chips pumping information in and out of external memory. Moving data costs energy,” Fick says. Mythic minimizes data movement by running inference algorithms in the same flash memory cells that hold the millions of weights of the machine-learning model.
"Mythic is heavily invested in processing inside blocks of flash memory, and running machine learning more efficiently and faster than other chips available on the market"
Mythic opts for analog computing over digital signals because analog currents avoid the power normally spent constantly switching transistor voltages. Inside Mythic’s chips, all the memory cells drive simultaneously, which shortens the distance data must travel into the processor and handles additional operations essentially for free. Mythic excels at combining blocks of memory into a parallel processor; each block uses digital circuits to convert instructions into analog currents. Being reconfigurable, those digital circuits also keep analog currents from swamping other parts of the processor.
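The digital-analog boundary described above can be sketched as a simple tile model. This is an assumption-laden illustration, not Mythic’s architecture: each memory block’s digital circuitry converts incoming digital activations to analog voltages (a DAC), the array performs the current-summing multiply, and an ADC digitizes the column currents so results can travel between blocks as ordinary digital values. The bit widths and full-scale choices here are arbitrary.

```python
import numpy as np

# Hypothetical tile model (names, bit widths, and scaling are assumptions).
DAC_BITS = 8
ADC_BITS = 8

def dac(codes: np.ndarray, full_scale: float = 1.0) -> np.ndarray:
    """Map unsigned integer codes to analog row voltages."""
    return codes / (2**DAC_BITS - 1) * full_scale

def adc(currents: np.ndarray, full_scale: float) -> np.ndarray:
    """Quantize analog column currents back to integer codes."""
    codes = np.round(currents / full_scale * (2**ADC_BITS - 1))
    return np.clip(codes, 0, 2**ADC_BITS - 1).astype(int)

rng = np.random.default_rng(1)
weights = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances stored in the array
activations = rng.integers(0, 256, size=4)     # digital inputs from a prior block

volts = dac(activations)                       # digital -> analog at the tile edge
currents = volts @ weights                     # analog multiply-accumulate in-array
full_scale = weights.sum(axis=0).max()         # worst-case column current
outputs = adc(currents, full_scale)            # analog -> digital for the next block
```

Keeping the analog domain confined to a single tile, with digital conversion at its edges, is what lets the surrounding circuits stay reconfigurable while the currents stay contained.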
A Promising Road Ahead
Mythic has already raised $40 million from new and existing investors, including Lux Capital, Draper Fisher Jurvetson, SoftBank Ventures, and more. The new cash will enable Mythic to hire more top-flight engineers “from companies like Oracle and TI—engineers with decades of production experience,” Henry said. The company recently introduced a machine-learning accelerator, an object detection processor, and related software tools for smartphones and security cameras. With more companies entering the ML market, Mythic’s margin for error has narrowed. The company aims to stay at the forefront of the ML inference market, innovating and evolving at a constant pace.