

Title:
APPARATUS AND METHOD FOR E-KNITTING
Document Type and Number:
WIPO Patent Application WO/2018/092128
Kind Code:
A1
Abstract:
The invention is an apparatus for e-knitting, which essentially pertains to real-time guidance of a knitting process using recorded knitting actions as reference for real-time knitting. The apparatus that facilitates e-knitting comprises: e-needles, which are knitting needles that comprise location and orientation sensors configured to monitor location and orientation of the needles in 3D (three-dimensional) space, error alarm means configured to inform a user of knitting errors, communication means, a controller and a power source; a digital medium with an app stored thereon, configured to receive real-time location and orientation data of the e-needles in 3D space and record or compare them to reference knitting actions stored on the digital medium; and a communication network for communicating between the communication means on the e-needles and the digital medium.

Inventors:
ERENEST, Emma Margarita (Perets Hayut st. 5, 14 Tel-Aviv, 6326214, IL)
SHARABI, Nadav (Nachlat Binyamin st. 153, 38 Tel Aviv, 6606638, IL)
Application Number:
IL2017/051240
Publication Date:
May 24, 2018
Filing Date:
November 15, 2017
Assignee:
ERENEST, Emma Margarita (Perets Hayut st. 5, 14 Tel-Aviv, 6326214, IL)
International Classes:
D04B3/02; G06F19/00
Foreign References:
US8615319B22013-12-24
Attorney, Agent or Firm:
YUSHKOV, Nikita (Shenhav Konforti & Co. - Attorneys, P.O.B. 29671 9 Ahad Ha'am st, Tel-Aviv 01, 6129601, IL)
Claims:
Claims

1. An apparatus for e-knitting comprising:

e-needles comprising location and orientation sensors configured to monitor location and orientation of said e-needles in 3D space, error alarm means configured to inform a user on knitting errors, communication means, controller and power source;

non-transitory digital medium comprising an app stored thereon configured for receiving real-time location and orientation data of said e-needles in 3D space and recording and/or comparing said location and orientation data to reference knitting actions stored on said digital medium; and

communication network for communicating between said communication means in said e-needles and said digital medium.

2. The apparatus according to claim 1, wherein said e-needles comprise a pair of left and right needles, said alarm means in left and right needles are configured to operate separately from each other in response to different knitting errors.

3. The apparatus according to claim 1, wherein said sensors are 3D space location and orientation sensors configured to monitor movement in 3D space.

4. The apparatus according to claim 1, wherein said sensors are selected from gyroscope, accelerometer and magnetometer and any combination thereof.

5. The apparatus according to any one of claims 1-4, wherein said sensors are configured to monitor movement of said e-needles in space by any one of the ways selected from: monitoring movement paths of said e-needles; identifying extreme points, diversion points or turning points in said movement paths of said e-needles in space; monitoring movement of said e-needles within a selected range according to a selected sequence of actions or order of movement; measuring relative location of said e-needles to each other, each needle to itself or both at the same time; measuring location of said e-needles relative to an absolute selected coordinate system; and any combination thereof.

6. The apparatus according to claim 1, wherein said communication network is a wireless communication network selected from the internet, cellular, Wi-Fi and Bluetooth.

7. The apparatus according to claim 5, wherein said wireless communication means is configured for wireless communication according to suitable wireless protocols selected from GSM, CDMA, UMTS, CDMA2000, TD-SCDMA, GPRS, EDGE, Bluetooth, IEEE 802.11, IEEE 802.15.3, IEEE 802.15.4, IEEE 802.16, IEEE 802.20, IEEE 802.22, DECT, WDCT, UMA, HIPERLAN, BRAN and HIPERMAN.

8. The apparatus according to claim 1, wherein said communication means is wire communication means comprising USB wire communication between said communication means and said digital medium.

9. The apparatus according to claim 1, wherein said sensors, error alarm means, communication means, controller and power source are enclosed in a knob, said knob is configured to be screwed on top end of said e-needles.

10. The apparatus according to claim 1, wherein said sensors, error alarm means, communication means, controller and power source are enclosed in a clip, said clip is configured to fasten at any selected location on said e-needles.

11. The apparatus according to claim 1, wherein said sensors, error alarm means, communication means, controller and power source are enclosed in a clip, said clip is made of metallic ferromagnetic material and configured to magnetically attach at any selected location on said e-needles, said e-needles are made of metallic ferromagnetic material.

12. The apparatus according to claim 1, wherein said digital medium is selected from smartphone, tablet, laptop, desktop, industrial knitting machine, AR glasses and wearable device with recording, computing and digital data storing means and wire or wireless communication means.

13. The apparatus according to claim 1, wherein said digital medium is configured for receiving and recording location and orientation data from said location and orientation sensors on said e-needles, storing said data as particular gestures of knitting, generating a file comprising a particular code, said code comprising a sequence of said location and orientation data equivalent to location and orientation movements of said e-needles, and assigning a name for each sequence of said location and orientation data, said sequence forming said gesture, said gesture representing a particular stitch.

14. The apparatus according to claim 13, wherein said stitch is selected from CO (Cast On), K (Knit), P (Purl), Mp (Mop) and GR (Garter).

15. The apparatus according to claim 13, wherein said digital medium is further configured to guide a knitter through a knitting session by monitoring said location and orientation movements of said e-needles and comparing them to stored location and orientation data of said gesture.

16. The apparatus according to claim 1, wherein said digital medium is configured to display knitting instructions graphically and/or textually, wherein said instructions are a sequence of knitting stitches that form a knitting pattern selected by a knitter.

17. The apparatus according to claim 16, wherein said digital medium further comprises a database or is configured to communicate with a database, said database comprising a plurality of knitting patterns for a knitting session.

18. The apparatus according to claim 15, wherein said digital medium is further configured to diagnose a particular position of said e-needles, analyze progress in a knitting session relative to an elected knitting scheme, identify an error in movement or position in space of said e-needles or a stitch made and send a command to said controller in said e-needles to set an alarm off.

19. The apparatus according to claim 18, wherein said alarm is selected from visual, audible, vibrational, on-screen graphic and/or on-screen text alarm.

20. The apparatus according to claim 19, wherein said visual alarm is a light bulb, said light bulb is configured to flash with a selected color according to a predetermined color code when set off.

21. The apparatus according to claim 19, wherein said on-screen graphic alarm is a flashing icon at a current location of a stitch in a graphical 2D presentation of a knitting pattern, said icon being of a particular symbol and/or color according to a predetermined color code for a corresponding status or state of a knitting action in said knitting session.

22. The apparatus according to claim 21, wherein said states and status are selected from knitting stitches, knitting errors, knitting states, level of progress in said knitting session and speed of knitting.

23. The apparatus according to any one of claims 19-22, further comprising an on-screen graphic and/or textual display of rating performance in said knitting session.

24. The apparatus according to claim 1, wherein a single or plurality of reference points, lines, planes are defined for alignment setup of said e-needles, wherein alignment of said e-needles to said reference points, lines, planes is done in a method selected from manual alignment methods, alignment using said e-needles orientation sensors or GPS means and a single or plurality of mechanical, magnetic, electromagnetic or wireless alignment fixtures, wherein alignment fixtures are utilized to define said reference points, lines and planes or align said e-needles with said reference points, lines and planes.

25. The apparatus according to claim 1, wherein the alignment fixtures are embedded inside said digital medium.

26. The apparatus according to claim 1, wherein said sensors, error alarm means, wireless communication means, controller and power source are introduced into a knob configured to be screwed on a top end of said e-needles.

27. The apparatus according to claim 1, wherein said sensors, error alarm means, wireless communication means, controller and power source are mounted on said e-needles with a clip mechanically attached to a body of said e-needles.

28. The apparatus according to claim 1, wherein said e-needles are made of metallic ferromagnetic material, wherein said sensors, error alarm means, wireless communication means, controller and power source are enclosed in a clip made of ferromagnetic material configured to magnetically attach to body of said e-needles, said clip is mounted on said e-needles at any selected location.

29. The apparatus according to claim 1, further comprising a smart glove comprising space location identification means, said smart glove is configured to be worn on hand and monitor changes in palm and finger movement in a knitting action, said glove comprising means for identifying and tracking location in space of said hand, which is similar to said means mounted on said e-needles.

30. The apparatus according to claim 1, further comprising a camera configured to visually monitor a knitting action and location change and movement of said e- needles and transmit digitally encoded visual information to said digital medium, said digital medium is configured to receive said visual data, decode and analyze said data and retrieve required information on state and location of said e-needles and knitting process.

31. The apparatus according to claim 30, wherein said camera is a stills camera configured to take pictures at a selected frame speed.

32. The apparatus according to claim 30, wherein said camera is a video camera configured to continuously monitor a knitting session.

33. The apparatus according to any one of claims 30-32, wherein said digital medium is configured to store said visual information from said camera until end of a selected knitting session.

34. The apparatus according to any one of claims 30-32, wherein said digital medium is configured to transmit said visual information from said camera to a particular database in a data space allocated for a particular knitter on a data storage cloud.

35. The apparatus according to claim 16, wherein said knitting instructions are recorded in a hand knitting pattern map, charts and/or text.

36. The apparatus according to claim 16, wherein said knitting instructions are recorded in industrial knitting machine code and pattern map.

37. A method of recording e-knitting comprising:

obtaining an apparatus for e-knitting as claimed in any one of claims 1-36;

receiving location and orientation data from said location and orientation sensors mounted on said e-needles in said digital medium;

recording and storing said data as particular gestures of knitting;

generating a corresponding file for every type of gesture, said file comprising a particular code that comprises a sequence of location and orientation data equivalent to location and orientation movements of said e-needles; and

assigning an id for every gesture that identifies a particular stitch with a unique particular code.

38. A method of guiding an e-knitting session comprising:

obtaining an apparatus for e-knitting as claimed in any one of claims 1-36;

displaying knitting instructions to a knitter, graphically and/or textually, said instructions being a sequence of knitting stitches that form a selected knitting pattern;

receiving location and orientation sequence movements from said location and orientation sensors on said e-needles;

transforming said movement sequence to a corresponding actual code;

comparing said actual code to registered code of a type of stitch gesture that should be done;

signalling a knitter on said type of stitch that should be knitted; and

in case of mismatch between said actual code and said registered code, signalling said alarm in said e-needles to set off.

39. The method according to claim 38, wherein said guiding e-knitting session is a walk-through session comprising continuous instructions for knitting stitches to said knitter in selected knitting pattern.

40. The method according to claims 35 or 36, wherein said type of stitch is selected from CO (Cast On), K (Knit), P (Purl), Mp (Mop) and GR (Garter).

41. The apparatus according to claim 1, further configured for guiding e-knitting in an inline session for a single or plurality of students by a single or plurality of instructors, said apparatus further comprising:

- a microphone and a voice or text-to-speech instructions means;

- sharing network means between different digital media and users comprising multi-user media, a plurality of inline instructive means, software and hardware communication means; and

- a plurality of screens for inline communication between e-knitting users and instructors.

Description:
Apparatus and Method for E-Knitting

Technical Field

The present invention pertains to e-knitting. More particularly, the present invention pertains to apparatus and method for real-time guidance of knitting actions using location and orientation sensors, digital platform and wireless communication.

Background

Knitting is a craft that requires skill and training. Acquiring this craft is usually done by direct teaching with a human guide who observes the pupil's work and corrects her mistakes. Learning by imitation is also possible using visually recorded knitting lessons or printed manuals. None of these methods, however, provides onsite real-time supervision of the knitting process, and none can amend every knitting error in real time. There is, therefore, a need for real-time means and a method for guiding the knitting process and supervising the knitter's actions and errors.

It is, therefore, an object of the present invention to provide a system for real-time instructing and guiding knitting by a user.

It is yet another object of the present invention to provide a method for real-time instructing and guiding knitting by a user.

This and other objects and embodiments of the present invention shall become apparent as the description proceeds.

Summary

In the broad scope of the present invention, e-knitting essentially pertains to real-time guidance of the knitting process using recorded knitting actions as reference for real-time knitting. The apparatus that facilitates e-knitting comprises:

e-needles, i.e., knitting needles that comprise location and orientation sensors, which are configured to monitor location and orientation of the needles in 3D (three-dimensional) space, error alarm means that is configured to inform a user of knitting errors, wireless communication means and a power source; a digital medium with an app stored thereon, which is configured for receiving real-time location and orientation data of the e-needles in 3D space and recording or comparing them to reference knitting actions stored on the digital medium; and

a wireless communication network for communicating between the wireless communication means on the e-needles and the digital medium.

The communication between the e-needles and the digital medium is essentially bidirectional. The location and orientation sensors on the e-needles pick up the needles' movement by moving together with them in the knitting action. Wireless communication means in the e-needles communicate the dynamics of location and orientation to the digital medium through a wireless network. In the opposite direction, the digital medium receives the location and orientation data from the location and orientation sensors in real-time and processes it according to the selected mode of operation.

In one embodiment of the present invention, a single or plurality of reference points, lines and planes are defined for the alignment setup of the e-needles. The alignment of the e-needles to these reference points, lines and planes can be done by several methods, such as manual alignment methods, alignment using the e-needles' orientation sensors or GPS means, or using a single or plurality of mechanical, magnetic, electromagnetic or wireless alignment fixtures. The alignment fixtures can be utilized to define the reference points, lines and planes for the movement of the e-needles within the borders defined for any selected e-knitting program. Further, the fixtures can be used to align the e-needles with the reference points, lines and planes. In one embodiment the alignment fixtures can be embedded in the digital platform.

In connection with the movement of the e-needles, this movement can be monitored in several different modes during knitting. Accordingly, the relative positions of the e-needles can be determined in corresponding ways. Namely, the e-needles are assigned particular space coordinates according to the particular elected mode for determining the relative locations of the e-needles in space. The relative locations change dynamically with the movement of the e-needles in the knitting process and are continuously assigned updated coordinates, which are consistent with the elected mode. Particular but not exclusive examples for dynamic and continuous monitoring of space locations of the e-needles are detailed as follows:

1) Monitoring the movement paths of the e-needles.

2) Identifying extreme points, diversion points or turning points in the path of movement of the e-needles in space.

3) Monitoring movement of the e-needles within a selected range according to a selected sequence of actions or order of movement, e.g., right then left, then down and so on, according to the particular stitch for knitting.

4) Measuring relative location of the e-needles to each other, each needle to itself or both at the same time.

5) Measuring location of the e-needles relative to an absolute selected coordinate system.
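Purely as an illustrative aid and not part of the claimed apparatus, the second monitoring mode above, identifying turning points in a movement path, could be sketched in software as detecting direction reversals in a sampled path. The function name, data layout and sign-change criterion are assumptions for illustration:

```python
# Hypothetical sketch: identify turning points in an e-needle's movement path
# as sign changes in the first difference of one monitored coordinate.
def turning_points(path, axis=0):
    """path: list of (x, y, z) samples; returns indices where the
    monitored coordinate reverses direction."""
    points = []
    for i in range(1, len(path) - 1):
        prev_delta = path[i][axis] - path[i - 1][axis]
        next_delta = path[i + 1][axis] - path[i][axis]
        if prev_delta * next_delta < 0:  # direction reversal at sample i
            points.append(i)
    return points

# The needle moves right, reverses, then moves right again.
path = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (1.5, 0, 0), (1, 0, 0), (2, 0, 0)]
print(turning_points(path))  # [2, 4]
```

Extreme points of a real sensor stream would additionally need smoothing, which is omitted here for brevity.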

In general, there are two modes of operation for the digital medium: recording and guiding. In recording mode, the digital medium receives location and orientation data from the location and orientation sensors which are mounted on the e-needles and records and stores them as particular gestures of knitting. In fact, every knitting gesture is essentially a unique sequence of movements, each movement being defined by particular location and orientation in 3D space. Accordingly, the digital medium generates a corresponding file for every type of gesture, the file comprising a particular code that comprises a sequence of location and orientation data equivalent to the location and orientation movements of the e-needles. The digital medium then assigns a name for every gesture that identifies the particular stitch with its unique particular code. Some typical basic examples of stitches are provided in the accompanying Figures, which include CO (Cast On), K (Knit), P (Purl), Mp (Mop) and GR (Garter). This is only a partial list of types of stitches and does not include all types that can be monitored and recorded in the present invention.
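The recording mode described above can be sketched, under assumed data structures, as storing each gesture as a named sequence of location and orientation samples and serializing one file-like record per gesture type. All field names here are illustrative assumptions, not taken from the application:

```python
import json

# Hypothetical sketch of the recording mode: each gesture is a named
# sequence of (location, orientation) samples; the sequence itself serves
# as the gesture's particular code.
def record_gesture(name, samples, store):
    """samples: list of {"loc": (x, y, z), "ori": (roll, pitch, yaw)} dicts."""
    store[name] = samples

gestures = {}
record_gesture("K", [{"loc": (0, 0, 0), "ori": (0, 30, 0)},
                     {"loc": (1, 0, 1), "ori": (0, 45, 0)}], gestures)

# One "file" per gesture type, keyed by the assigned stitch name (K = Knit).
gesture_file = json.dumps({"name": "K", "code": gestures["K"]})
```

The assigned name (CO, K, P and so on) identifies the stitch, while the sample sequence is the unique code later used for comparison in guiding mode.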

In the guiding mode, the digital medium performs the following operations:

displaying knitting instructions to the knitter;

receiving location and orientation movements sequence from the location and orientation sensors on the e-needles;

transforming the movements sequence to corresponding actual code;

comparing the code to the registered code of the type of stitch gesture that should be done;

signalling the knitter on the type of stitch that should be knitted; and

in case of mismatch between the actual code and the registered code, signalling the alarm in the e-needles to set off.
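A minimal sketch of the comparison step in the guiding mode follows; the quantization of raw samples to a coarse grid before matching is an assumed tolerance scheme, not something the application specifies:

```python
# Hypothetical sketch of the guiding-mode comparison: the movement sequence
# is quantized into a coarse code and matched against the registered code
# for the expected stitch; a mismatch triggers the alarm on the e-needles.
def quantize(samples, step=1.0):
    """Map raw (x, y, z) samples to a coarse grid so small tremors do not
    register as different gestures (the tolerance is an assumption)."""
    return [tuple(round(v / step) for v in s) for s in samples]

def check_stitch(actual_samples, registered_code, signal_alarm):
    actual_code = quantize(actual_samples)
    if actual_code != registered_code:
        signal_alarm()  # command the e-needle controller to set the alarm off
        return False
    return True

registered_knit = [(0, 0, 0), (1, 0, 1)]   # registered code for a K gesture
alarms = []
ok = check_stitch([(0.1, 0.0, -0.1), (0.9, 0.2, 1.1)], registered_knit,
                  lambda: alarms.append("alarm"))
print(ok, alarms)  # True []  (small deviations quantize to the same code)
```

A real implementation would match orientation as well as location; only location is shown to keep the sketch short.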

In one particular embodiment, the knitting instructions are displayed graphically and/or textually. In another particular embodiment, these instructions are essentially a sequence of knitting stitches that form a knitting pattern selected by the knitter. In particular, the digital media comprises a database or connects to a database that comprises a plurality of knitting patterns that the knitter can select from for a knitting session.

In one particular embodiment, the digital medium is also configured to distinguish between the two needles, left and right, and accordingly signal the right or left e-needle to set an alarm off when identifying a knitting error.

In another embodiment of the present invention, the e-knitting method is utilized for guiding e-knitting in an online session, the guiding comprising a single or plurality of students and a single or plurality of corresponding instructors. Accordingly, the e-knitting system further comprises:

- an internet, Cellular, Wi-Fi, Bluetooth or other wireless communication means;

- a microphone and a voice or text-to-speech instructions conversion means;

- sharing network means between different digital platforms and users comprising multi-user platforms, a plurality of inline instructive means and software and hardware communication means.

In a further embodiment of the present invention, a plurality of additional screens is added to the knitting app for monitoring the inline communication means between the e-knitting users and instructors.

The instructors perform online e-knitting practicing lessons, correcting the students' e-knitting techniques by visual demonstrations, which are recorded and translated into optional formats such as hand-knitting pattern maps, charts, texts and other instructions. The instructors can pass on basic and advanced e-knitting exercises and other training methods. The instructors can also record e-knitting lessons and demonstrations, which can be downloaded for offline training purposes by users and students.

In another embodiment, the e-knitting digital platform software comprises a plurality of corrective e-knitting algorithms, which are configured to identify the user's e-knitting errors with respect to the correct knitting pattern, debug them and suggest the required corrections to the knitter. In one embodiment, the detection, debugging and suggested corrections are performed online, while the user carries out the e-knitting process. In another embodiment, the detection, debugging and suggested corrections are performed after the knitter's knitting process is concluded or terminated. In these embodiments, the debugging can be executed automatically or selectively per the knitter's request or command.
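A corrective algorithm of the kind described could, as a hypothetical sketch, diff the stitches actually made against the selected pattern row and report a correction per mismatching position. The function and stitch codes are illustrative assumptions:

```python
# Hypothetical sketch of a corrective e-knitting algorithm: compare the
# stitches the knitter actually made against the selected pattern row and
# report, for each position, what was knitted and what should have been.
def debug_row(actual, pattern):
    corrections = []
    for i, (got, want) in enumerate(zip(actual, pattern)):
        if got != want:
            corrections.append((i, got, want))
    return corrections

pattern_row = ["K", "P", "K", "P"]   # elected pattern: knit, purl, knit, purl
knitted_row = ["K", "K", "K", "P"]   # what the knitter actually did
for pos, got, want in debug_row(knitted_row, pattern_row):
    print(f"stitch {pos}: knitted {got}, expected {want}")
```

Run online, such a diff would be computed per stitch as it is recognized; run offline, over the whole recorded session.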

The following describes particular examples and embodiments of the present invention with reference to the accompanying drawings and without limiting the scope of the present invention.

Brief Description of the Drawings

Figs. 1A-1B illustrate the recording and instructing of an e-knitting action by a digital media.

Fig. 2 illustrates a magnified view of a first type of e-needles for e-knitting in the present invention.

Fig. 3 illustrates a magnified view of a second type of e-needles for knitting in the present invention.

Figs. 4A-4B illustrate the major components of the means for e-knitting, namely e-needles and digital media.

Figs. 5A-5B illustrate another option of the major components of the means for e-knitting.

Fig. 6 is a flow diagram of initiating the process of e-knitting.

Fig. 7 is a flow diagram of recording a process for e-knitting according to the present invention.

Fig. 8 is a flow diagram of real-time guidance and supervision of e-knitting.

Figs. 9A-9Z show screen flow displays of the accompanying app walking the user through the different screens and steps of e-knitting.

Detailed Description of the Drawings

Figs. 1A-1B show an illustration of constructing an e-knitting guiding lesson for a particular knot knitted by hand. The invention essentially relates to the concept of IoT (Internet of Things) for connecting between daily or regularly used devices, such as knitting needles, and digital platforms. Particular sensors are mounted on the needles (see Figs. 2 and 3 and corresponding description below) that monitor the user's knitting gestures, respond to these gestures in a particular manner dictated by their method of monitoring and type of gesture, translate their response to electromagnetic signals and transmit these signals to the digital medium. The digital medium receives the signals and translates them to a particular code that corresponds to the type of gesture that is monitored. Particularly, Fig. 1B illustrates a hand holding an e-knitting needle (1d) with a thread held by the hand and partially wrapped on the needle (1d). An orientation sensor and wireless transmission means are mounted on the needle (shown in Figs. 2 and 3). The orientation means detects the angle at which the needle is held in 3D space relative to the plane of knitting. The value of the angle is recorded on a digital memory recording means on the e-needle. The wireless transmission means receives the angle value and transmits it to the digital platform, which is presented as the display screen in Fig. 1A. In the particular example in Fig. 1A, the digital platform guides the knitter to knit a stitch of the type CO (Cast On) in a relative location in 3D space, which is translated to a three-digit presentation, e.g., 003. As a further guiding aid, the digital platform shows the knitter the details of the previously knitted stitch, namely type, number and row, to ensure ordered continuity and consistency of the knitting process and final knitwear. In a further embodiment, the digital platform records and/or displays the gesture in a graphical display, namely a pattern map such as the ones shown in Figs. 9M, 9N, 9O, 9S and 9T, and/or textual mode depending on its current mode of operation. In a further embodiment, the knitter may switch between graphical and textual display and guidance of the knitting status and action on the display of the digital platform. The platform will accordingly convert the graphical symbols that indicate the location of the stitch on the grid to the corresponding coordinates in 3D space. In still another embodiment, the digital platform simultaneously shows graphical and textual presentations of a stitch on the display. Particularly, the platform may show the locations of a knitted stitch on the grid and the advancement of the knitting process and location of the current stitch to be knitted parallel to the coordinates of the current stitch in 3D space. In another embodiment, the communication means may also be wire communication means, e.g., a USB cable. Such wire communication is also configured to enable bidirectional communication between the e-needles and the digital platform, transmitting space location, moves and angles of the e-needles to the platform and commands for particular alerts from the platform to the e-needles.
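The conversion of a stitch's grid location on the pattern map to coordinates in 3D space, mentioned above, might be sketched as follows; the stitch spacing and plane origin are purely illustrative assumptions, not values from the application:

```python
# Hypothetical sketch: convert a stitch's (row, column) position on the
# pattern-map grid to coordinates in the knitting plane's 3D frame.
STITCH_W, STITCH_H = 0.5, 0.7   # assumed stitch width/height in cm

def grid_to_3d(row, col, origin=(0.0, 0.0, 0.0)):
    ox, oy, oz = origin
    # Columns advance along x, rows along y; the knitting plane is z = oz.
    return (ox + col * STITCH_W, oy + row * STITCH_H, oz)

print(grid_to_3d(2, 3))
```

The inverse mapping, from sensed 3D coordinates back to a grid cell, would divide by the same spacings and round.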

In one particular embodiment, the digital platform comprises wireless communication means and a protocol configured for communication with the wireless means and protocol mounted on the e-needles in combination with the orientation sensors, memory means for storing digital data thereon and control means. Accordingly, communication between the e-needles and the digital platform is bi-directional. This enables transmitting data from the needles relating to their orientation in 3D space to the digital platform and signalling on wrong orientation of the needles in forming a new stitch. A warning alarm may then be set off in response to the signal the platform sends.

In this respect it should be noted that the communication between the e-needles and the digital platform is not limited to any particular means or protocol. For example, different wireless communication protocols may be used when wireless means is implemented in the communication between the e-needles and the digital platform. Preferable but not exclusive protocols are Wi-Fi, Bluetooth and IEEE. Other wireless communication protocols and corresponding wireless communication networks, which may be applicable for communication between the e-needles and the digital platform, are GSM, CDMA, UMTS, CDMA2000, TD-SCDMA, GPRS, EDGE, Bluetooth, IEEE 802.11, IEEE 802.15.3, IEEE 802.15.4, IEEE 802.16, IEEE 802.20, IEEE 802.22, DECT, WDCT, UMA, HIPERLAN, BRAN and HIPERMAN.

In one particular embodiment, the e-knitting system is configured to operate in two optional modes:

A recording mode: knitting gestures are sent to the digital platform as an editable and sharable code and appear on screen as a pattern map, knitting chart or text instructions.

A playback or guiding mode: light, sound and/or vibrations indicate to the knitter what type of stitch she should be knitting next based on a chosen pattern on the digital platform. Another option is a visual on-screen display of a signal. Such a signal may be in the form of a flashing or blinking icon at the current location of a stitch in the graphical 2D presentation of the knitting pattern. The icon may be a particular symbol and/or color for a corresponding status or state of a knitting action in a knitting session. For example, the icon may show an 'X' sign at the proper location on the graphic 2D screen display to indicate a knitting error. Other signs may designate different states and statuses in a knitting session. Further, the sign may be colored with any selected color according to a predetermined color code. Alternatively or additionally, the signal may be in the form of a text message appearing on screen that indicates the particular state or status in the knitting session, for example a knitting error that the knitter makes or a type of stitch. The signalling of state or status may further be distinguished by assigning different symbols and/or colors to different states, for example different types of knitting stitches, knitting errors, knitting states, level of progress in the knitting session and speed of knitting. A rating of performance in a knitting session, weighing all relevant variables that determine it, may also appear on screen in graphic and/or text modes.
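The predetermined symbol and color code described above can be sketched as a simple lookup table; the particular symbols, colors and state names are illustrative assumptions, not values from the application:

```python
# Hypothetical sketch of a predetermined color/symbol code: each knitting
# state maps to the icon and color shown at the stitch's grid location.
STATUS_CODE = {
    "error":   ("X", "red"),     # knitting error at this stitch
    "current": ("*", "yellow"),  # stitch to be knitted next
    "done":    ("o", "green"),   # correctly completed stitch
}

def icon_for(status):
    """Return (symbol, color) for a state, with a fallback for unknowns."""
    return STATUS_CODE.get(status, ("?", "gray"))

symbol, color = icon_for("error")
print(symbol, color)  # X red
```

Extending the table with entries for stitch types or progress levels would distinguish further states, as the text suggests.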

The connectivity to the digital platform can be done by wireless connectivity means using related protocols such as Bluetooth, Wi-Fi, IEEE and other optional wireless protocols. Alternatively, communication of the e-needles and the digital platform may be with wire communication means. For example, the e-needles may be directly connected to the digital platform with USB cable connectivity. Other wire connectivity means may be used to communicate space coordinates and movement data from the e-needles to the digital platform. The bi-directional communication between the e-needles and the digital platform also enables the wire communication to transmit commands to indicate a particular state during e-knitting to the controller that controls the signalling components in the e-needles. For example, the digital platform is configured to diagnose a particular position of the e-needles, analyze the progress in the knitting process relative to the elected knitting scheme, identify an error in the movement or position in space of the e-needles or a stitch made and accordingly send a command to the controller in the e-needles to set a visual alarm off with a light bulb of a selected particular color. For example, the light bulb may flash in red when set off.

While monitoring the dynamic change of location and movement of the e-needles is done with space location identification means, other means for identifying their location and monitoring their movement are contemplated within the scope of the present invention. A non-limiting example is a smart glove, which may be worn on the knitter's hand to monitor the changes in palm and finger movement during the knitting actions.

The glove itself may comprise means for identifying and tracking the location in space of the hand, similar to the means mounted on the e-needles. Tracking hand movement may provide exclusive or supplemental information on the relative or absolute position of the e-needles and determine whether the knitting action is progressing correctly. Another non-limiting example is a camera configured to visually monitor the knitting action and the location change and movement of the needles, and to transmit digitally encoded visual information to the digital platform. The platform is configured to receive such visual data, decode and analyze it, and accordingly retrieve the required information on the state and location of the needles and the knitting process. In one particular embodiment, the camera may be a stills camera programmed to take pictures at a selected frame rate, for example one frame per second. In a second particular embodiment, the camera may be a video camera configured to continuously monitor a knitting session. The visual information from the camera, stills or video, may be stored in the digital platform until the end of any selected knitting session. Alternatively, the photos and/or video footage may be transmitted to a particular database, in data space allocated to the particular knitter on a cloud data store.

Digital Platform: The digital platform may be any type of digital device, such as a mobile phone, tablet, laptop, desktop, industrial knitting machine or AR glasses.

Figs. 2 and 3 illustrate assembled and exploded views of two types of pairs of knitting e-needles (la, la') and (2a, 2a') with sensors, alarm means and wireless communication means mounted on them. The e-needles (la, la') shown in Figs. 2 and 4A are single-pointed, with the sensors, alarm and wireless communication means embedded within their proximal heads (lb, lb'), i.e. knobs, opposite the distal ends, namely the tips (le, le'). The e-needle proximal heads (lb, lb') are screwed into e-needle body parts (Id, Id') at their corresponding proximal sides (lc, lc'). Figs. 3 and 5A show another type, a pair of circular needles (2a, 2a') connected to each other by a wire (2f), with the sensors (2b, 2b') and the alarm and wireless communication means embedded within their proximal ends (2e, 2e'). The needles can be fabricated from various types of materials, such as metal, plastic, wood and bamboo, or other materials with the mechanical properties, such as stiffness and flexibility, required for various types of knitting functionalities. Such needles are also configured to form appropriate and reliable housing for the components embedded in them, such as sensors, alarm means, controllers and wireless communication means. The e-needles in Figs. 2 and 3 illustrate a particular implementation of the concept of e-needles in the present application. These figures show means embedded within the needles that make digital communication, monitoring and guidance of a knitting process possible. In still another particular embodiment, the present invention provides an apparatus for monitoring, determining location in 3D space and guiding needles in a knitting action, where this apparatus is configured to be mounted on idle or regular knitting needles and transform them into e-needles with the capabilities and functionalities detailed above. This apparatus may be in the form of a clip with fastening means to hold it to the body of the needles in any selected position and place.
Alternatively, the apparatus comprises magnetic fastening means for attaching it to needles made of metallic and/or magnetically attractive material.

As further shown in the zoom-in illustrations of Figs. 2 and 3 on the single-pointed and circular needles, respectively, the knob head (lb, lb'), shown screwed and unscrewed, is particularly configured to accommodate the sensor, controller, alarm and wireless communication means and battery in an appropriate inner space and slot, respectively.

The sensors are pluggable and can be fitted to needles of various geometrical shapes and sizes, and can be mechanically attached to the bottom part of the knitting needles (le, le') by glue, clips, screws, magnetic attachment means, any combination of one or more of the foregoing, or any other mechanical attachment mechanism.

The sensors move together with the needle and respond accordingly, thereby enabling real-time monitoring of the actual movement of the needle and its recording for future guidance, real-time monitoring of a knitting process, real-time error alarms and on-site correction of errors. A particular alarm may be selected for each needle, distinguishing between the errors the user makes in the movement of each needle. The recorded gestures can be translated into knitting instructions in several optional formats, such as:

- Hand knitting pattern map, charts, texts and instructions.

- Industrial knitting machine code and pattern map.
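As a sketch of the first output format, a recorded row of stitch ids can be compressed into conventional hand-knitting text (e.g. "K2, P2"). The stitch ids follow the app screens described later in this application; the helper name is illustrative.

```python
from itertools import groupby

def row_to_instructions(stitches: list[str]) -> str:
    """Compress a row of recorded stitch ids into knitting-chart text.

    Consecutive runs of the same stitch id collapse into "<id><count>",
    the usual hand-knitting convention.
    """
    return ", ".join(f"{stitch}{len(list(run))}"
                     for stitch, run in groupby(stitches))

# e.g. a recorded 2x2-rib row:
# row_to_instructions(["K", "K", "P", "P", "K", "K"]) -> "K2, P2, K2"
```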

As shown in Figs. 4-5, in both types of e-needles the sensors monitor the user's gesture and transmit it to the digital medium through a wireless communication system and a related signal (4). The digital medium (3a, 3b, 3c, 3d) then translates the gesture to a corresponding code and records it for replay in a future knitting session, or displays it graphically and/or as text on screen for the user knitting in real-time. Such a digital medium, illustrated in Figs. 4-5, can be a user's smartphone or other mobile device, tablet, laptop, desktop, industrial knitting machine, AR glasses or any other wearable device with recording, computing and digital data storage capabilities and wireless communication with the particular sensors in the e-needles. When in guiding mode, the apparatus of the present invention may guide a human user or an industrial knitting machine, the principles of guidance remaining the same. Thus, the apparatus and method of e-knitting of the present invention facilitate the industrialization of customized knitting, enabling faster, mass manufacturing of knitting patterns designed by particular users.

Figs. 6 through 8 are flow diagrams detailing the steps for monitoring needle movement, communicating with the digital platform, recording, displaying, guiding a user in real-time, identifying errors in needle movement and setting off an alarm for correcting needle movement during knitting. Particularly, Fig. 6 is a flow diagram (100) of the initial setting up of the apparatus for operation, with steps (110-160). First, the application is loaded and initiates a search for wireless communication with the sensors embedded in the e-needles. Further steps are then allowed once the e-needles are identified and communication with them is established. Fig. 7 displays flow (200) with the particular steps (205-250) for recording movement of the needles and storing the particular gestures for every type of knitting stitch. As specified in Fig. 4, the sensors are selected from a gyroscope, accelerometer and magnetometer, and in general any type of location and orientation sensor, or combination thereof, that can monitor movement in 3D (three-dimensional) space, in this case movement of the needles. The digital medium begins recording the needle gestures, where every gesture is identified by a name that relates to the particular movements required to form a particular knitting stitch (205). Accordingly, the digital medium waits for raw data of a movement event sent from the needles, i.e., the sensors mounted on the needles (210). Knitting may be done in still or moving positions and, in addition, in static or moving reference spaces. For example, sitting on a chair at home is a static position in a static reference space, while sitting in a moving train is a static position in a moving reference space. The sensors mounted on the needles therefore capture the raw movement of the needles, which requires cancelling out the movement of the reference space.
Accordingly, for proper gesture recognition or recording, the digital medium calculates the differential motion and/or orientation of the needles in the reference space and removes the background motion and/or orientation. This is carried out by using the device that records the movement of the needles as a reference, because it too travels with the surrounding space or stays in a static position (215). The device can be a user's smartphone or other mobile device, tablet, laptop, desktop, industrial knitting machine, AR glasses or any other wearable device with recording, computing and digital data storage capabilities and wireless communication with the particular sensors in the e-needles. The differential data is then recorded until completion of the gesture or until the off button is pressed (225), and the digital medium stores the data and time-stamps it (230). The stored data of the particular gesture is processed, saved and tagged with a user ID and gesture name (235-245). The digital medium then returns to recording mode until the process is completely turned off (250).
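The background-motion cancellation of step (215) can be sketched as a per-axis subtraction of the recording device's reading from the needle's reading, under the simplifying assumption (not stated in the disclosure) that the two IMUs' axes are aligned:

```python
def cancel_reference_motion(needle_accel, device_accel):
    """Per-axis differential acceleration of the needle in the reference space.

    The recording device travels with the same reference space as the needle
    (e.g. both on a moving train), so subtracting its reading leaves only the
    knitting gesture itself.
    """
    return tuple(n - d for n, d in zip(needle_accel, device_accel))

# e.g. on a train accelerating along x, both IMUs share the train's motion:
# cancel_reference_motion((2.0, 0.25, 9.8), (1.5, 0.0, 9.8)) -> (0.5, 0.25, 0.0)
```

In practice the same subtraction would apply to orientation rates as well, per the "differential motion and/or orientation" wording above.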

Fig. 7 details the particular use of a nine-DOF (Degrees Of Freedom) IMU (Inertial Measurement Unit) data event received from the needles to initiate a session of recording a knitting gesture or guiding a knitting session, steps (215-220). Essentially, the DOF count totals the number of axes and sensors combined for balancing any plane, in this case the plane of operation of the needles. The IMU combines 3D space location and orientation sensors, including accelerometers, gyroscopes and sometimes magnetometers, to provide dynamic information on the needles' location and orientation in space. In one particular application, the 9 DOF is a 6-DOF unit with a magnetometer, namely a compass.
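A nine-DOF IMU data event as described can be sketched as a record of three three-axis readings; the type and field names below are illustrative, not taken from the disclosure:

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass(frozen=True)
class ImuEvent:
    accel: Vec3   # accelerometer: linear acceleration, 3 axes
    gyro: Vec3    # gyroscope: angular rate, 3 axes
    mag: Vec3     # magnetometer: magnetic field, the compass reference

    @property
    def dof(self) -> int:
        """Three axes per sensor: 6 DOF (accel + gyro) plus magnetometer = 9."""
        return 3 * sum(1 for _ in (self.accel, self.gyro, self.mag))
```

Dropping the `mag` field would leave the 6-DOF case the text mentions; the magnetometer is what upgrades it to 9 DOF.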

Fig. 8 details the flow (300) of gesture recognition for guiding a user during a knitting process that follows recorded knitting gestures. As in the recording process of Fig. 7, the digital medium receives raw data from the sensors embedded in the needles (305) and cancels the movement of the reference space in which the needles move (310-315). The differential motion and/or orientation of the e-needles is identified as a particular gesture (320) when compared with the recorded gesture database (325). The digital medium then concludes whether an actual gesture has been detected (330). A positive answer causes the formation of the gesture to be displayed on screen and indicates that a gesture event has taken place. A negative answer sets off the error alarm and returns to the start of gesture recognition (340-350). Beyond flagging knitting errors identified by the digital medium as described above, the apparatus of the e-needles and digital medium/platform provides the e-knitter with continuous walk-through guidance of the knitting session. Accordingly, the digital medium instructs the e-knitter which stitch should be knitted at every point in the knitting session. This enables even inexperienced knitters to knit different stitches and patterns under the close monitoring and guidance of the digital platform of the apparatus of the present invention. Figs. 9A-Y show the sequence of screens of the app operating on the digital medium and the actions taken when e-knitting with the apparatus of the present invention. The front start, sign-in and sign-up screens are required in every knitting session, Figs. 9A-C. The recording of the knitting gestures may then follow: the gesture dynamics are received as gesture movements in 3D coordinates, the movement of the reference space in which the needles move is cancelled, and the result is translated to the corresponding app code, Fig. 9D. Every type of stitch is assigned an id, as seen in the screens in Figs. 9E-G: CO standing for Cast On stitch, K for knit stitch, P for purl stitch, Mp for Mop stitch and GR for Garter stitch. Other types of stitches may be monitored, recorded, translated to computer code and identified with a proper id.
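The matching steps of Fig. 8 (320-330) can be sketched as a nearest-neighbour comparison of the differential motion trace against the recorded gesture database, returning None (error alarm) when no recorded gesture is close enough. The distance measure and threshold below are assumptions; a real implementation might well use dynamic time warping or a learned classifier instead.

```python
import math

def trace_distance(a, b):
    """Mean Euclidean distance between two equal-length 3D motion traces."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(trace, gesture_db, threshold=1.0):
    """Return the name of the closest recorded gesture, or None (error alarm).

    gesture_db maps gesture names (stitch ids) to reference traces; traces
    are assumed pre-resampled to equal length.
    """
    name, dist = min(((n, trace_distance(trace, t))
                      for n, t in gesture_db.items()),
                     key=lambda item: item[1])
    return name if dist <= threshold else None
```

A positive result drives the on-screen gesture display; a None result corresponds to branch (340-350), setting off the error alarm and restarting recognition.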
The recording may be graphical or textual, for replay and on-screen display to the e-knitting user, who follows the instructions for knitting the different types of stitches in any selected pattern. Appropriate app screens for these actions are shown in Figs. 9H-J, to which the e-knitter may move when knitting with the sensor-mounted e-needles and the corresponding app. Figs. 9K-T show screens of visual 2D mapping of the knitting progress of a particular pattern, a sweater in the case shown. The screens in Figs. 9K-O show the process of recording an entire knitting session in the order of the stitches knitted. Every stitch is identified by its type and its coordinates relative to neighbouring stitches and to the borders of the pattern being knitted. The knitting and recording may be stopped and resumed at any point during the knitting session. A particular pattern may be selected from a database for monitoring, identification or recognition and recording. The on-screen 2D mapping of Figs. 9P-T presents the progress of e-knitting by a user who follows the knitting instructions for a particular pattern. The sensors receive haptic feedback from the digital medium and visually signal the e-knitter the type of the next stitch, for example with a visual light indication. Such visual or audible signalling may also be used to signal errors. In playback mode, the e-knitter may enter the desired size for any selected pattern. The size then translates to the number of stitches required, and the display and playback of the base pattern adapt to that size, as exemplified in the screens in Figs. 9U-X. The e-knitter may also view useful and instructive information for the particular knitting project, which helps her select one, including the knitting instructor. Such information may include the rating of knitting instructors, their expertise, the level of knitting of a particular item, the amount of thread for knitting that item, the average work hours for completing the knitting and the money invested in purchasing materials for knitting. As shown in Fig. 9Y, at several steps along the recorded knitting session the whole project can be edited, discussed with an expert, supplemented with photos and comments, and exported, shared or offered for sale online to other users. Finally, Fig. 9Z summarizes the full sequence of screens of the knitting app operating on the digital medium, with some major interrelations between the different screens. The knitter is walked through a knitting session by passing from one screen to another on the digital platform. Each screen forms part of a sequence of screens designed to properly guide the knitter through the possible actions, which are made available according to the particular stage the knitter is at and the steps she takes during the session.
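The size-adaptation step of Figs. 9U-X, translating a chosen size into a stitch count, can be sketched as proportional scaling of each row's stitch count. The scaling rule and the assumption of unchanged gauge are illustrative only; a real app would also adjust shaping rows.

```python
def scale_pattern(base_rows, base_width_cm, target_width_cm):
    """Scale each row's stitch count in proportion to the target width.

    base_rows lists stitches per row in the base pattern; gauge (stitches
    per cm) is assumed unchanged, so stitch counts scale linearly.
    """
    factor = target_width_cm / base_width_cm
    return [max(1, round(n * factor)) for n in base_rows]

# e.g. a 40 cm base pattern resized to 50 cm:
# scale_pattern([80, 80, 76], 40, 50) -> [100, 100, 95]
```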

Enlarged views of these screens are shown in Figs. 9A-X.