
MIT researchers find new approach to streamline knitting.

Published: August 8, 2019
Author: TEXTILE VALUE CHAIN

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have come up with a new approach to streamline the knitting process: a new system and design tool for automating knitted garments. The new system, called InverseKnit, translates photos of knitted patterns into instructions that are then used with machines to make clothing.

A paper on InverseKnit was presented by Alexandre Kaspar, CSAIL PhD student, alongside MIT postdocs Tae-Hyun Oh and Petr Kellnhofer, PhD student Liane Makatura, MIT undergraduate Jacqueline Aslarus, and MIT professor Wojciech Matusik, at the International Conference on Machine Learning this past June in Long Beach, CA. Programming knitting machines for new designs can be a tedious and complicated ordeal: every single stitch has to be specified, and even one mistake can throw off the entire garment. An approach like InverseKnit could let casual users create designs without a memory bank of coding knowledge, and even reconcile issues of efficiency and waste in manufacturing, according to a press release by MIT. “As far as machines and knitting go, this type of system could change accessibility for people looking to be the designers of their own items,” said Kaspar.

“We want to let casual users get access to machines without needing programming expertise, so they can reap the benefits of customisation by making use of machine learning for design and manufacturing.”

In another paper, researchers came up with a computer-aided design tool for customising knitted items. The tool lets non-experts use templates for adjusting patterns and shapes, like adding a triangular pattern to a beanie, or vertical stripes to a sock. Users can make items customised to their own bodies, while also personalising for preferred aesthetics.

To get InverseKnit up and running, the CSAIL team first created a dataset of knitting instructions and the matching images of those patterns. They then trained their deep neural network on that data to interpret 2-D knitting instructions from images. When testing InverseKnit, the team found that it produced accurate instructions 94 per cent of the time.
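The paper's full architecture is not reproduced in the press release, but the general idea of mapping a photo to a per-stitch instruction grid can be illustrated with a minimal sketch. The model below is a hypothetical convolutional network in PyTorch that classifies each cell of a fixed-size instruction grid into one of several knitting operations; the grid size, instruction vocabulary, and layer choices are illustrative assumptions, not the architecture from the paper.

```python
# Minimal, hypothetical sketch of an image-to-knitting-instruction model in PyTorch.
# This is NOT the InverseKnit architecture: grid size, instruction vocabulary,
# and layers are illustrative assumptions only.
import torch
import torch.nn as nn

NUM_INSTRUCTIONS = 17    # assumed size of the knitting-instruction vocabulary
GRID_H, GRID_W = 20, 20  # assumed size of the per-stitch instruction grid

class KnitInstructionNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional encoder: photo of the knitted patch -> feature map
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Resample features to the instruction-grid resolution and
        # predict one instruction class per grid cell.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d((GRID_H, GRID_W)),
            nn.Conv2d(128, NUM_INSTRUCTIONS, kernel_size=1),
        )

    def forward(self, image):
        # image: (batch, 3, H, W) photo of the knit pattern
        # returns: (batch, NUM_INSTRUCTIONS, GRID_H, GRID_W) per-cell logits
        return self.head(self.encoder(image))

if __name__ == "__main__":
    model = KnitInstructionNet()
    photos = torch.randn(4, 3, 160, 160)            # dummy batch of photos
    targets = torch.randint(0, NUM_INSTRUCTIONS,    # dummy instruction grids
                            (4, GRID_H, GRID_W))
    logits = model(photos)
    # Per-cell cross-entropy: each grid cell is a small classification problem,
    # so accuracy can be reported as the share of correctly predicted stitches.
    loss = nn.CrossEntropyLoss()(logits, targets)
    print(logits.shape, loss.item())
```

Framed this way, the reported 94 per cent figure corresponds to how often the predicted instructions matched the ground-truth instructions during testing.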
