Many AI-powered services today are provided by hosting a model on a server and accessing it from the device. However, server-based services carry risks in terms of cloud operating costs and data security. In recent years, therefore, research has continued on training and running inference with AI models directly on the device, without relying on servers. Because a device has fewer resources than a server, several challenges must be overcome to run models on-device. NNStreamer and NNTrainer are open source projects that address these challenges and provide developers with convenient and effective tools. This session introduces NNStreamer and NNTrainer, open source frameworks for on-device training and inference.
Samsung Electronics