MediaTek's next flagship chipset, widely expected to launch as the Dimensity 9300, will integrate Meta's Llama 2 large language model, allowing generative AI applications to run entirely on-device without going through the cloud.
MediaTek touts several advantages of offering generative AI on-device, including "seamless performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operation cost."
MediaTek's Dimensity portfolio already includes APUs with generative AI features such as AI Noise Reduction and AI MEMC, and devices powered by the Dimensity 9200, like the Vivo X90 Pro and Oppo Find X6, highlight these features.
MediaTek plans to roll out a software stack optimised to run Llama 2, alongside an upgraded APU with Transformer backbone acceleration, a reduced memory footprint, and more efficient use of DRAM bandwidth, to enable better on-device generative AI use cases.
The company has not named the chip, officially saying only that these features will arrive in a next-gen flagship SoC launching later this year.
MediaTek notes that the initial wave of phones powered by the hardware will be available before the end of the year, so it isn't unreasonable to assume the brand has already secured a design win with one of the major Chinese manufacturers.