AI case study

Kakao: AI model training

Kakao's AI development was bottlenecked by its GPU clusters. After switching to Cloud TPUs, the team upcycled a 2.1B-parameter dense model into a 13.4B-parameter mixture-of-experts (MoE) system.

Published 6 months ago

Key results

Model parameters: 24.6B
MoE model parameters: 13.4B
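The case study does not include Kakao's training code, but the dense-to-MoE step it describes is commonly done via "sparse upcycling": each expert in the new MoE layer is initialized as a copy of the dense model's feed-forward block, and only the router is trained from scratch. The sketch below is a generic, hypothetical illustration of that idea on toy weight shapes (all function names and dimensions are invented for this example), not Kakao's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_ffn(d_model=16, d_ff=32):
    """A toy dense feed-forward block: x @ W1 -> ReLU -> @ W2."""
    return {
        "W1": rng.normal(0, 0.02, (d_model, d_ff)),
        "W2": rng.normal(0, 0.02, (d_ff, d_model)),
    }

def upcycle_to_moe(ffn, n_experts=4, d_model=16):
    """Sparse upcycling: every expert starts as a copy of the dense
    FFN's weights; the router is newly initialized."""
    return {
        "experts": [{k: v.copy() for k, v in ffn.items()} for _ in range(n_experts)],
        "router": rng.normal(0, 0.02, (d_model, n_experts)),
    }

def moe_forward(moe, x, top_k=2):
    """Route each token to its top-k experts, mixing their outputs
    by softmax weights over the router logits."""
    logits = x @ moe["router"]                       # (tokens, n_experts)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(logits[t])[-top_k:]         # indices of the top-k experts
        w = np.exp(logits[t][top])
        w /= w.sum()                                  # softmax over the chosen experts
        for weight, e in zip(w, top):
            exp = moe["experts"][e]
            h = np.maximum(x[t] @ exp["W1"], 0)       # ReLU
            out[t] += weight * (h @ exp["W2"])
    return out

ffn = dense_ffn()
moe = upcycle_to_moe(ffn, n_experts=4)
x = rng.normal(size=(3, 16))

# Right after upcycling, all experts are identical and the routing
# weights sum to 1, so the MoE output matches the dense FFN exactly.
dense_out = np.maximum(x @ ffn["W1"], 0) @ ffn["W2"]
print(np.allclose(moe_forward(moe, x), dense_out))  # True
```

This also shows why total parameter count grows several-fold while the compute per token stays modest: each expert adds a full FFN's worth of weights, but only the top-k experts run for any given token.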


The story

Context

South Korea's leading technology company operates the country's dominant messaging platform along with a vast ecosystem spanning digital payments, mobility, and entertainment.

Challenge

Developing a proprietary Korean-language AI model required massive computing power, but scaling GPU clusters beyond single server units caused...

Solution


Use case

Kakao's AI model training is part of this use case:

AI Infrastructure
86 case studies (+97% YoY)
Proven impact: 3.7 (Moderate) on a scale of Low / Moderate / Strong / Very Strong
3.0 (Moderate) within Software & Platforms
3.5 (Moderate) within Product Engineering

The company

Mobile messaging platform and internet services for the South Korean market.

Industry: Software & Platforms
Location: Jeju-si, Jeju-do, South Korea
Employees: 1K-5K
Founded: 2010

The vendor

Cloud computing services, AI infrastructure, and data analytics platforms for enterprises.

Industry: Technology
Location: Mountain View, CA, USA
Employees: 100K+
Founded: 1998

The implementation partner

Megazone Soft

megazonesoft.com
Role in this case study

Supported Kakao's AI and cloud transition in developing its 'Kanana' language model.

Industry: Technology
Location: Seoul, South Korea
Employees: 51-250
Founded: 2018

Similar Case Studies

Related implementations across industries and use cases

93 AI case studies in AI Infrastructure

618 AI case studies in Software & Platforms

1,313 AI case studies in Product Engineering