SqueezeBits, AI Lightweight Technology Startup, Secures $2 Million (KRW 2.5 Billion) Pre-Series A Funding
SqueezeBits (스퀴즈비츠), an AI lightweight technology startup, announced on Thursday that it has raised $2 million (KRW 2.5 billion) in pre-Series A funding from Kakao Ventures, Samsung Next, POSCO Technology Investment, and Postech Holdings.
SqueezeBits is developing AI lightweighting technology to enable the efficient operation of AI-based services. Lightweighting reduces the memory and computation required for AI model inference, enabling faster processing. SqueezeBits quantises 32-bit data down to 4 bits or less while maintaining model performance, a key technique for making AI models faster and lighter. The company also provides a software engine that runs quantised models efficiently on existing hardware. As AI-based services such as ChatGPT proliferate, the technology is expected to reduce operational costs, one of the biggest barriers to widespread AI adoption.
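For readers unfamiliar with the idea, the sketch below shows generic symmetric 4-bit quantisation of floating-point weights in Python. It is an illustration of the general technique only, not SqueezeBits' own implementation; the function names, per-tensor scaling scheme, and use of NumPy are assumptions made for this example.

```python
# Illustrative sketch of symmetric 4-bit quantisation (not SqueezeBits' method).
import numpy as np

def quantise_4bit(weights: np.ndarray):
    """Map FP32 values to signed 4-bit integers in [-8, 7] with a per-tensor scale."""
    scale = np.max(np.abs(weights)) / 7.0          # largest magnitude maps to +/-7
    q = np.clip(np.round(weights / scale), -8, 7)  # round, then clamp to the 4-bit range
    return q.astype(np.int8), scale                # stored in int8; only 4 bits are used

def dequantise(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate FP32 values for computation on existing hardware."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(4, 4).astype(np.float32)   # stand-in for a layer's weights
    q, s = quantise_4bit(w)
    w_hat = dequantise(q, s)
    print("max reconstruction error:", np.abs(w - w_hat).max())
```

The memory saving comes from storing 4-bit integers plus a single scale factor instead of 32-bit floats, roughly an 8x reduction, at the cost of a small reconstruction error.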
SqueezeBits’ lightweight AI technology can be deployed in a variety of environments, including smartphones, edge devices such as laptops, and GPU clouds. It supports multiple model types, including image, video, speech, and natural language models, enabling a wide range of applications. The company has already completed proofs of concept (PoCs) and projects with more than 20 companies, including NAVER and SK Telecom, deploying lightweighting technology for AI services in various fields, including super-large AI models. Recently, the company unveiled an on-device AI demo that runs a large language model (LLM) in real time on edge devices, attracting attention from the industry. It also demonstrated technology that runs the Stable Diffusion image-generation model in real time on a smartphone.
With the investment, the company plans to strengthen its competitiveness in lightweighting technology and expand in earnest into overseas markets. It recently launched the OwLite toolkit, which enables non-specialists to easily lightweight, compare, and analyse AI models. The toolkit currently supports NVIDIA GPUs and will be expanded to cover hardware from Intel, AMD, and Arm, as well as NPUs from AI semiconductor startups.
SqueezeBits was founded by a team of researchers in deep learning accelerator hardware, or neural processing units (NPUs), from the Postech graduate school. Over the past seven years, the co-founders have consistently published lightweighting papers at the world’s top machine learning conferences, including CVPR, NeurIPS, and ICLR, and to date have published more than 70 international papers on deep learning acceleration. They also have experience designing AI-specific hardware, and the team is said to be highly skilled at implementing optimisation techniques across the full stack, from AI algorithms to hardware.
Jungho Shin, VC at Kakao Ventures, said of the investment, “SqueezeBits is a team that can lead the democratisation of AI applications by providing solutions across the AI value chain based on its hardware and software expertise.”
Hyungjun Kim, CEO of SqueezeBits, said: “For AI-powered services to move from customer acquisition to monetisation, they need to dramatically reduce the cost of AI operations. With SqueezeBits’ AI lightweighting technology, we will help solve the cost and efficiency issues many companies face and maximise the potential of AI technology.”