Artificial Intelligence | News | Insights | AiThority

Revolutionary AI Breakthrough: Experience an Unparalleled Transformation of Your iPhone with Apple’s Latest Research

LLM in a Flash

Published on December 12, the new study, titled "LLM in a Flash: Efficient Large Language Model Inference with Limited Memory," could revolutionize the iPhone experience. It reveals a groundbreaking method for running complex AI systems directly on iOS devices, bringing a more immersive, intelligent experience to iPhone users.


3D animated avatars from single-camera footage

In the first study, Apple researchers present HUGS (Human Gaussian Splats), a method for creating animated 3D avatars from single-camera footage. According to principal author Muhammed Kocabas, the system can automatically disentangle a static scene from an animated human avatar in as little as 30 minutes, using only a monocular video with a modest number of frames (50-100).


Future demands of AI-infused services

Apple is looking ahead to the future demands of AI-infused services as it considers incorporating these breakthroughs into its product lineup, which could improve its devices even further. If Apple's new memory-management technique works as advertised, it might pave the way for a whole new category of apps and services that take advantage of LLMs in ways that weren't possible before.

In addition, Apple is contributing to the larger AI community by publishing its research, which could spur further advances in the field. Apple's willingness to do this shows how seriously it takes its role as a technological leader and its dedication to expanding human potential.



Flash storage optimization

Two new research papers showcased this month by the Cupertino-based tech giant announce substantial advancements in AI: novel methods for efficient language-model inference and for 3D avatar creation. The second study uses flash storage optimization to streamline large LLMs, a major step toward putting sophisticated AI inside the iPhone. It tackles the difficulty of executing LLMs whose parameters exceed the available DRAM by keeping the model parameters in flash memory and loading them into DRAM on demand. Data transfers from flash can then be optimized with an inference cost model that takes the characteristics of both flash and DRAM into account.
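The core idea of keeping weights in flash and pulling them into DRAM only when needed can be illustrated with a toy sketch. This is not Apple's implementation; all class and variable names here are hypothetical, a plain Python dictionary stands in for flash storage, and a bounded LRU cache stands in for the DRAM budget:

```python
import collections

class FlashWeightCache:
    """Toy model of on-demand weight loading: 'flash'-resident
    parameters are copied into a bounded 'DRAM' cache when a layer
    is requested, with least-recently-used eviction. Illustrative
    only; names and structure are assumptions, not Apple's design."""

    def __init__(self, flash_store, dram_budget):
        self.flash = flash_store           # dict: layer name -> weights (stand-in for flash)
        self.budget = dram_budget          # max number of layer chunks resident in "DRAM"
        self.dram = collections.OrderedDict()
        self.flash_reads = 0               # counts simulated flash I/O operations

    def get(self, layer):
        if layer in self.dram:             # cache hit: no flash transfer needed
            self.dram.move_to_end(layer)
            return self.dram[layer]
        self.flash_reads += 1              # cache miss: simulate a costly flash read
        if len(self.dram) >= self.budget:
            self.dram.popitem(last=False)  # evict the least recently used chunk
        self.dram[layer] = self.flash[layer]
        return self.dram[layer]

# Usage: a 6-layer "model" with DRAM room for only 2 layer chunks.
store = {f"layer{i}": [i] for i in range(6)}
cache = FlashWeightCache(store, dram_budget=2)
for name in ["layer0", "layer1", "layer0", "layer2", "layer1"]:
    cache.get(name)
print(cache.flash_reads)  # 4: the second access to layer0 was served from DRAM
```

The paper's inference cost model goes much further (accounting for flash read bandwidth, sparsity, and transfer granularity), but the cache above captures why minimizing flash-to-DRAM traffic is the central concern when the model does not fit in memory.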

To back up their claims, the researchers evaluated models such as Falcon 7B and OPT 6.7B. According to the research, the method increased inference speed by 4-5 times on CPU and 20-25 times on GPU compared with conventional loading approaches.


Why should users be happy?

Users of Apple products such as the iPhone stand to benefit substantially from this research on efficient LLM inference with limited memory. With powerful LLMs running efficiently on devices with limited DRAM, such as iPhones and iPads, users gain access to greater on-device AI capabilities: better language processing, smarter voice assistants, stronger privacy, potentially lower internet bandwidth usage, and, most significantly, advanced AI that is available and responsive to every iPhone user.

[To share your insights with us, please write to sghosh@martechseries.com]
