SESSION

The story of suffering terribly after architecting open source into an AI project

PREVIEW
In SK Hynix's semiconductor manufacturing process, the number of images taken of parts (semiconductors) is so large that the volume is hard to keep track of even over a short period. Building a deep learning model that classifies parts as defective or normal from image data whose patterns vary with each manufacturing step takes significant effort and time, and managing and updating a model for each process step requires still more resources. The entire flow of per-process data pipeline design, model training, deployment through A/B testing, and monitoring was ambitiously designed around Apache NiFi, but I would like to share a case where we ran into far more problems than expected and experienced many growing pains along the way.
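The preview mentions deploying models through an A/B test. As a rough, hypothetical sketch of what deterministic A/B routing between a baseline and a candidate model can look like in Python (every name below is illustrative and not taken from the talk):

    import hashlib
    import random

    AB_SPLIT = 0.10  # fraction of traffic routed to the candidate model (assumed value)

    def choose_model(image_id: str, split: float = AB_SPLIT) -> str:
        """Hash the image ID into a fixed number of buckets so the same image
        always hits the same variant, which keeps A/B metrics comparable."""
        bucket = int(hashlib.sha256(image_id.encode()).hexdigest(), 16) % 10_000
        return "candidate" if bucket < split * 10_000 else "baseline"

    def score_with_model(variant: str, image_id: str) -> float:
        """Stand-in for real inference: returns a fake defect probability.
        A real system would load and run the selected model here."""
        rng = random.Random(image_id + variant)
        return rng.random()

    if __name__ == "__main__":
        for image_id in ("wafer-001.png", "wafer-002.png", "wafer-003.png"):
            variant = choose_model(image_id)
            prob = score_with_model(variant, image_id)
            print(f"{image_id}: routed to {variant}, defect prob {prob:.3f}")

Deterministic hashing (rather than random assignment per request) is one common design choice for this kind of split: it keeps each image pinned to one variant across retries, so comparisons between the two models are not muddied by crossover traffic.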
  • Han Ki Hoon / SK Hynix Data Science Pro
  • He develops deep-learning models that classify parts produced in the semiconductor manufacturing process as good or defective by analyzing photographed images, serves those models stably in production, and manages the model life cycle.

Feel free to leave comments, questions, or opinions about the session.

* Comments that do not match the purpose of this board (including profanity and slander) will be deleted by the administrator without notice.

* Comments can only be posted while the session is ON-AIR and are refreshed every minute.
