Title:
METHOD AND SYSTEM FOR WRITING AND EDITING COMMON MUSIC NOTATION
Document Type and Number:
WIPO Patent Application WO/2017/195106
Kind Code:
A1
Abstract:
A computerized system for writing and editing common music notation, comprising: an electronic device comprising display means and input means; and a music score editing software running on said electronic device, said software comprising: user interface comprising user interface means configured to receive user stroke gestures in a time independent manner, in relation to a displayed music score; classifying means configured to classify said user stroke gestures; and rendering means configured to render said classified user stroke gestures on said displayed music score, wherein said classifying means are configured to classify each single user stroke gesture as one of: adding score symbols to said displayed score; editing score symbols in said displayed score; selecting score symbols in said displayed score; and deleting score symbols from said displayed score.

Inventors:
SHACHAM ALON (IL)
Application Number:
PCT/IB2017/052693
Publication Date:
November 16, 2017
Filing Date:
May 09, 2017
Assignee:
SHACHAM ALON (IL)
International Classes:
G06F3/01
Domestic Patent References:
WO2014080191A1 (2014-05-30)
Foreign References:
US20150228259A1 (2015-08-13)
US20140083279A1 (2014-03-27)
US20100288108A1 (2010-11-18)
Attorney, Agent or Firm:
FOGEL, Ronny (IL)
Claims:
CLAIMS

1. A computerized system for writing and editing common music notation, comprising:

an electronic device comprising display means and input means; and

a music score editing software running on said electronic device, said software comprising:

user interface comprising user interface means configured to receive user stroke gestures in a time independent manner, in relation to a displayed music score;

classifying means configured to classify said user stroke gestures; and rendering means configured to render said classified user stroke gestures on said displayed music score,

wherein said classifying means are configured to classify each single user stroke gesture as one of:

adding score symbols to said displayed score;

editing score symbols in said displayed score;

selecting score symbols in said displayed score; and

deleting score symbols from said displayed score.

2. The computerized system of claim 1, wherein said user stroke gestures comprise at least one of:

continuous stroke relative to an empty part of the score;

continuous stroke relative to a previously added score symbol or symbols;

swiping;

tapping or double tapping;

dragging;

panning;

zooming;

pressing; and

drawing a circle around a group of scores.

3. The system of claim 1, wherein said score editing software further comprises a score debugger configured to continuously analyze the score for problematic areas, indicate detected issues and suggest solutions.

4. The system of claim 1, wherein said score editing software further comprises an arrangement module configured to arrange the score symbols on the score in a relative way to their duration and by professional score rules, add missing measure bars and add missing rests.

5. The system of claim 4, wherein said arrangement module is further configured to perform auto fit for editing in real time.

6. The system of claim 1, wherein said user interface further comprises multi-functional buttons.

7. The system of claim 6, wherein said multi-functional buttons are configured to perform different functions when touched in different manners.

8. The system of claim 1, wherein said user interface further comprises at least one draggable button.

9. The system of claim 8, wherein said at least one draggable button comprises at least one of:

playback button;

record button; and

auto fix button.

10. The system of claim 9, wherein said playback button is configured to play back a selected part of said music score, said selected part determined by said user stroke gesture.

11. The system of claim 9, wherein said record button is configured to record music from at least one of:

a digital source;

a virtual instrument; and

an audio input.

12. The system of claim 9, wherein said auto fix button is configured to automatically arrange a selected part of said music score and automatically solve trivial score issues in said selected part of said music score, said selected part determined by said user stroke gesture.

13. The system of claim 1, wherein said classifying means are configured to classify said user stroke gestures according to gesture type, position on score and current state of said software's user interface.

14. A computerized method of writing and editing common music notation on an electronic device comprising display means and input means, using a score editing software, comprising:

displaying a music score on an electronic display screen;

receiving user stroke gestures in relation to said displayed music score;

classifying said user stroke gestures; and

rendering said classified user stroke gestures on said displayed music score, wherein said classifying comprises classifying each single user stroke gesture as one of:

adding score symbols to said displayed score;

editing score symbols in said displayed score;

selecting score symbols in said displayed score;

deleting score symbols from said displayed score; and

unrecognized user stroke gesture.

15. The computerized method of claim 14, wherein said user stroke gestures comprise at least one of:

continuous stroke relative to an empty part of the score;

continuous stroke relative to a previously added score symbol or symbols;

swiping;

tapping or double tapping;

dragging;

panning;

zooming;

pressing; and

drawing a circle around a group of scores.

16. The method of claim 14, further comprising continuously analyzing the score for problematic areas, indicating detected issues and suggesting solutions.

17. The method of claim 14, further comprising arranging the score symbols on the score in a relative way to their duration and by professional score rules, adding missing measure bars and adding missing rests.

18. The method of claim 17, further comprising performing auto fit for editing in real time.

19. The method of claim 14, wherein said score editing software further comprising using at least one draggable button to perform at least one of:

playback;

recording; and

auto fixing.

20. The method of claim 19, wherein said playback comprises playing back a selected part of said music score, said selected part determined by said user stroke gesture.

21. The method of claim 19, wherein said recording comprises recording music from at least one of:

a digital source;

a virtual instrument; and

an audio input.

22. The method of claim 19, wherein said auto fixing comprises automatically arranging a selected part of said music score and automatically solving trivial score issues in said selected part of said music score, said selected part determined by said user stroke gesture.

23. The method of claim 14, wherein said classifying said user stroke gestures comprises classifying according to gesture type, position on score and current state of said software's user interface.

24. One or more computer-storage media embedded with computer-executable instructions, the embedded computer-executable instructions are executed by at least one processor for performing a method of writing and editing common music notation on an electronic device comprising display means and input means, using a score editing software, comprising:

displaying a music score on an electronic display screen;

receiving user stroke gestures in relation to said displayed music score;

interpreting user stroke gestures; and

rendering said interpreted user stroke gestures on said displayed music score.

25. The system of claim 1, wherein said classifying means are configured to classify each said user's single stroke using at least one of:

said stroke's size and ratio;

said stroke's proximity to other score symbols on the score;

said stroke's vector of directions; and

said stroke's normalized and transformed shape.

26. The method of claim 14, wherein said classifying comprises classifying each said user's single stroke using at least one of:

said stroke's size and ratio;

said stroke's proximity to other score symbols on the score;

said stroke's vector of directions; and

said stroke's normalized and transformed shape.

27. The system of claim 1, further comprising grading means configured to grade said classification results.

28. The method of claim 14, further comprising grading said classification results.

29. The system of claim 1, further comprising means for aggregating multiple of said single user strokes into multiple stroke score symbols.

30. The system of claim 29, wherein said aggregated strokes comprise gestures unrecognized by said classifying means.

31. The method of claim 14, further comprising aggregating multiple of said single user strokes into multiple stroke score symbols.

32. The method of claim 31, wherein said aggregated strokes comprise gestures unrecognized by said classifying.

33. The system of claim 1, wherein said classifying means comprise means for cleaning and normalizing said user stroke gestures input data.

34. The system of claim 33, wherein said means for cleaning comprise means for trimming the beginning of the stroke.

35. The system of claim 33, wherein said means for normalizing comprise means for running a spline interpolation on the touch data's touch points.

36. The method of claim 14, wherein said classifying comprises cleaning and normalizing said user stroke gestures.

37. The method of claim 36, wherein said cleaning comprises trimming the beginning of the stroke.

38. The method of claim 36, wherein said normalizing comprises running a spline interpolation on the touch data's touch points.

39. The system of claim 27, wherein said grading means comprise machine learning means for creating grading formulas.

40. The method of claim 28, wherein said grading comprises using grading formulas created by using machine learning.

41. The system of claim 1, wherein said electronic device's display is a touchscreen display.

42. The system of claim 1, wherein said electronic device's display is not a touchscreen display, further comprising a second electronic device having a touchscreen display and user interface means, said second electronic device synchronized with said electronic device.

43. The system of claim 42, wherein said score is displayed on both said electronic device and said second electronic device and wherein said gestures are made on said electronic device's user interface means and synchronized with said second electronic device.

44. The system of claim 42, wherein said score is displayed on both said electronic device and said second electronic device, further comprising means for selecting an area to be edited on the score displayed on said electronic device display and means for displaying said selected area on said second electronic device display wherein said editing is done on said second electronic device.

45. The system of claim 42, wherein said score is displayed on both said electronic device and said second electronic device and wherein said gestures are made on said second electronic device's user interface means and synchronized with said electronic device.

46. The system of claim 1, wherein said electronic device's display is not a touchscreen display and wherein said electronic device further comprises a trackpad.

47. The system of claim 46, wherein said gestures are made on said trackpad, further comprising means for displaying said trackpad contour on said displayed score.

48. A method of locking a computer screen comprising:

displaying a login screen comprising staves; and

receiving a password from a user, said password comprising music notes drawn on said staves.

49. The method of claim 48, wherein said music notes are selected from the group consisting of: a single bar, a single staff and multiple staves, with one or more notes and rests on them.

Description:
METHOD AND SYSTEM FOR WRITING AND EDITING COMMON MUSIC NOTATION

FIELD OF THE INVENTION

The invention is related to associative shape and handwriting recognition in music score editing interfaces.

CROSS-REFERENCE TO RELATED PATENT APPLICATIONS

This patent application claims priority from and is related to U.S. Provisional Patent Application Serial Number 62/333,284, filed 9 May 2016, which is incorporated by reference in its entirety herein.

BACKGROUND OF THE INVENTION

Writing, creating and editing music notation sheets can be done by writing with a pencil or pen on music score paper, or by using one of the several score editing software applications available.

Using today's score editing PC software applications and web applications is not an easy task. The user is required to use a keyboard and/or mouse and/or a connected MIDI device (e.g. a digital piano keyboard), and also to gain considerable knowledge and expertise in operating the software, in order to write basic music notation and perform basic editing functions. Writing music in these applications is normally slow, and in order to accelerate the process the user has to learn software-specific keyboard shortcuts. Most of today's touch-based software applications mimic the PC notation software interfaces and rely mainly on menus for picking notation symbols to add to the score, and on other menus for changing existing notation symbols. These solutions are slow and cumbersome and require a long learning curve before the user can accomplish basic music notation writing and editing functionality. Other, more recent touchscreen-based notation applications let the user write notation symbols in his own handwriting, using a finger and/or a stylus on a touchscreen; after being triggered, the application uses OCR (Optical Character Recognition) technology to identify the handwriting and transform it into digital notation. These solutions do not give the user immediate audio feedback for the notes written, and so are not fit for novice musicians, as they require the user to know in advance how the notes will sound.

There are other experiments and patents on handwriting recognition. These generally involve capturing a plurality of strokes and then combining and recognizing them in relation to legal music notation rules. In a manner of speaking, using this kind of recognition, a writer has to know in advance exactly what he is about to write.

None of the software applications described above gives the user a feeling of ease and simplicity similar to writing music notation using pen and paper, and as a result they may block the user's creative flow.

SUMMARY OF THE INVENTION

The present invention is based on musical staves that are presented on an interactive display, on a scrollable canvas view or otherwise, as a musical score (Hereinafter called the score). Staves on the score are used as reference for drawing and editing notation symbols on the score.

The system attempts to recognize every single gesture the user strokes on the score as a music notation symbol or part of one. The recognition process relates to other notation symbols already on the score (Hereinafter called score symbols), meaning that existing score symbols may change to adjust to any newly added stroke shape. A stroke can also be recognized as a score editing shortcut (e.g. drawing a circle around score symbols to select them, or drawing a squiggly line to erase score symbols), and previously unrecognized strokes may be used for multi stroke symbol recognition. If the recognition was successful, the proper symbol or part of a symbol will be added to the score, the surrounding score symbols will be rearranged and the score symbol's notes will be immediately played back through an audio output unit or any digital protocol like MIDI, if available. In case the recognition was not successful, the single stroke gesture is saved for future multiple stroke pattern recognition.

According to a first aspect of the present invention there is provided a computerized system for writing and editing common music notation, comprising: an electronic device comprising display means and input means; and a music score editing software running on said electronic device, said software comprising: user interface comprising user interface means configured to receive user stroke gestures in a time independent manner, in relation to a displayed music score; classifying means configured to classify said user stroke gestures; and rendering means configured to render said classified user stroke gestures on said displayed music score,

wherein said classifying means are configured to classify each single user stroke gesture as one of: adding score symbols to said displayed score; editing score symbols in said displayed score; selecting score symbols in said displayed score; and deleting score symbols from said displayed score.

The user stroke gestures may comprise at least one of: continuous stroke relative to an empty part of the score; continuous stroke relative to a previously added score symbol or symbols; swiping; tapping or double tapping; dragging; panning; zooming; pressing; and drawing a circle around a group of scores.

The score editing software may further comprise a score debugger configured to continuously analyze the score for problematic areas, indicate detected issues and suggest solutions.

The score editing software may further comprise an arrangement module configured to arrange the score symbols on the score in a relative way to their duration and by professional score rules, add missing measure bars and add missing rests.

The arrangement module may further be configured to perform auto fit for editing in real time.

The user interface may further comprise multi-functional buttons.

The multi-functional buttons may be configured to perform different functions when touched in different manners. The user interface may further comprise at least one draggable button.

The at least one draggable button may comprise at least one of: playback button; record button; and auto fix button.

The playback button may be configured to play back a selected part of said music score, said selected part determined by said user stroke gesture.

The record button may be configured to record music from at least one of: a digital source; a virtual instrument; and an audio input.

The auto fix button may be configured to automatically arrange a selected part of said music score and automatically solve trivial score issues in said selected part of said music score, said selected part determined by said user stroke gesture.

The classifying means may be configured to classify said user stroke gestures according to gesture type, position on score and current state of said software's user interface.

The classifying means may be configured to classify each said user's single stroke using at least one of: said stroke's size and ratio; said stroke's proximity to other score symbols on the score; said stroke's vector of directions; and said stroke's normalized and transformed shape.

The system may further comprise grading means configured to grade said classification results.

The system may further comprise means for aggregating multiple of said single user strokes into multiple stroke score symbols.

The aggregated strokes may comprise gestures unrecognized by said classifying means.

The classifying means may comprise means for cleaning and normalizing said user stroke gestures input data.

The means for cleaning may comprise means for trimming the beginning of the stroke. The means for normalizing may comprise means for running a spline interpolation on the touch data's touch points.

The grading means may comprise machine learning means for creating grading formulas. The electronic device's display may be a touchscreen display.

The electronic device's display may not be a touchscreen display, and the system may further comprise a second electronic device having a touchscreen display and user interface means, said second electronic device synchronized with said electronic device.

The score may be displayed on both said electronic device and said second electronic device and said gestures may be made on said electronic device's user interface means and synchronized with said second electronic device.

The score may be displayed on both said electronic device and said second electronic device, and the system may further comprise means for selecting an area to be edited on the score displayed on said electronic device display and means for displaying said selected area on said second electronic device display wherein said editing is done on said second electronic device.

The score may be displayed on both said electronic device and said second electronic device and said gestures may be made on said second electronic device's user interface means and synchronized with said electronic device.

The electronic device's display may not be a touchscreen display and said electronic device may further comprise a trackpad.

The gestures may be made on said trackpad, and the system may further comprise means for displaying said trackpad contour on said displayed score.

According to a second aspect of the present invention there is provided a computerized method of writing and editing common music notation on an electronic device comprising display means and input means, using a score editing software, comprising: displaying a music score on an electronic display screen; receiving user stroke gestures in relation to said displayed music score; classifying said user stroke gestures; and rendering said classified user stroke gestures on said displayed music score, wherein said classifying comprises classifying each single user stroke gesture as one of: adding score symbols to said displayed score; editing score symbols in said displayed score; selecting score symbols in said displayed score; deleting score symbols from said displayed score; and unrecognized user stroke gesture.

The user stroke gestures may comprise at least one of: continuous stroke relative to an empty part of the score; continuous stroke relative to a previously added score symbol or symbols; swiping; tapping or double tapping; dragging; panning; zooming; pressing; and drawing a circle around a group of scores.

The method may further comprise continuously analyzing the score for problematic areas, indicating detected issues and suggesting solutions.

The method may further comprise arranging the score symbols on the score in a relative way to their duration and by professional score rules, adding missing measure bars and adding missing rests.

The method may further comprise performing auto fit for editing in real time.

The score editing software may further comprise using at least one draggable button to perform at least one of: playback; recording; and auto fixing.

The playback may comprise playing back a selected part of said music score, said selected part determined by said user stroke gesture.

The recording may comprise recording music from at least one of: a digital source; a virtual instrument; and an audio input.

The auto fixing may comprise automatically arranging a selected part of said music score and automatically solving trivial score issues in said selected part of said music score, said selected part determined by said user stroke gesture.

Classifying said user stroke gestures may comprise classifying according to gesture type, position on score and current state of said software's user interface.

Classifying may comprise classifying each said user's single stroke using at least one of: said stroke's size and ratio; said stroke's proximity to other score symbols on the score; said stroke's vector of directions; and said stroke's normalized and transformed shape.

The method may further comprise grading said classification results.

The method may further comprise aggregating multiple of said single user strokes into multiple stroke score symbols. The aggregated strokes may comprise gestures unrecognized by said classifying.

Classifying may comprise cleaning and normalizing said user stroke gestures.

Cleaning may comprise trimming the beginning of the stroke.

Normalizing may comprise running a spline interpolation on the touch data's touch points.

Grading may comprise using grading formulas created by using machine learning.

According to another aspect of the present invention there is provided one or more computer-storage media embedded with computer-executable instructions, the embedded computer-executable instructions are executed by at least one processor for performing a method of writing and editing common music notation on an electronic device comprising display means and input means, using a score editing software, comprising: displaying a music score on an electronic display screen; receiving user stroke gestures in relation to said displayed music score; interpreting user stroke gestures; and rendering said interpreted user stroke gestures on said displayed music score.

According to another aspect of the present invention there is provided a method of locking a computer screen comprising: displaying a login screen comprising staves; and receiving a password from a user, said password comprising music notes drawn on said staves.

The music notes may be selected from the group consisting of: a single bar, a single staff and multiple staves, with one or more notes and rests on them.

BRIEF DESCRIPTION OF THE DRAWINGS

For better understanding of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings.

With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the accompanying drawings:

Fig 1 provides a basic view of a tablet-like device that consists of a context view in which the score is presented;

Fig 2 is a block diagram of the design behind the system's architecture;

Fig 3, Fig 4 and Fig 5 show some examples of new symbols gestures;

Figs 6 through 12 show some examples of change gestures;

Fig 13 shows a selected group of score symbols;

Fig 14 shows an example of an erased quarter note;

Fig 15 is a flowchart showing the steps taken by the stroke classifier;

Fig 16 shows an example of adding an eighth tail stroke to a quarter note, with a stem facing up, from top to bottom;

Fig 17 shows one of the possible ways to map stroke directions;

Fig 18 shows the need for stroke priorities;

Fig 19 shows two examples of context menus, for notes and for measure bars;

Fig 20 shows in three images a horizontal dragging of a quarter note;

Fig 21 and Fig 22 show an example of a possible interface for the score debugger;

Fig 23 shows proper stem directions for quarter note chords with a single note circle each;

Fig 24 shows an example of improper beaming and the same notes under proper note beaming;

Fig 25 shows a small example of arranging a measure by duration;

Fig 26 illustrates a system architecture comprising a touchscreen device and a synchronized second computer having a display;

Fig 27 illustrates a system architecture comprising an electronic device having a trackpad; and

Fig 28 shows an example of a user interface for using music notes as a password for unlocking a computing device.

DETAILED DESCRIPTION OF EMBODIMENTS

Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.

The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.

The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.

Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.

Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.

Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.

These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a

programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.

The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.

The flowchart and block diagram in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.

The present invention provides a system, method and computer readable program instructions for editing music notation. The system is time-independent in that a user may take as long as required to enter a stroke on the score, and the system does not take this time into consideration when classifying the user's stroke, as will be explained in detail below. This makes the system suitable for children, adults, handicapped people, etc.

The basic view is shown in Fig 1, consisting of a tablet-like device with a context view in which the score is presented 101.

Alternatively, the application may run on any computer unit that has a physical, virtual or projected display surface presenting the score (Hereinafter the display) and that receives input signals from any type of pointing device, such as a finger, a stylus, a mouse or any other virtual or physical finger-simulating device which may detect strokes (Hereinafter finger). The computer unit may be, for example, a virtual or augmented reality system projecting a canvas in space, equipped with movement sensors tracking the user's palm gestures.

In musical terms a score consists of one or more 'systems', each being the normally recurring stack of all of the score's instrument staves (Hereinafter a repetition). The score in 103 shows two repetitions of an instrument that has two instrument staves per repetition.

The system model presented in Fig 2 describes in general terms the design behind the system's architecture. The score manager 220 handles the digital representation of the current score file: its data model, layout data, pages, repetitions, staves and all other score symbols. When the editor is loaded the system uses the rendering manager 217 to render score elements on top of the scrollable context view 204 in a memory efficient manner that fits the technical limitations of the computer unit. When using the system the user touches the screen using his finger or fingers, generating touch events 201. The touch events are then classified by the input touch classifier 203 and trigger different actions depending on the type and position of the touch events and the current state of the application's user interface.

Panning and zooming related touch events are classified to be handled by the scrolling manager 205, a unit that controls the context view's positioning in coordination with the rendering manager 217, which manages the context view's graphical rendering to the display. Touch events that occur in the context of menus are classified to be handled by the menus & popovers manager 207, which is responsible for managing the display and behaviour of menus and dialogs and their responses to touch events. Adding and editing score symbols is accomplished by drawing simple strokes on the display 202 using a finger. 103 (Fig 1) shows a tablet device with a context zoom scale more suitable for editing using a stylus as an input device. Panning and zooming over the canvas is done either by performing multi-touch swipe and pinch gestures, if multi-touch support is available, or by using a panning mode and pitch controls triggered either by a virtual controller on the display or by an external synced controller.
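Purely as an illustration of this dispatch logic, a minimal sketch is given below; the event fields, UI-state fields and handler names are assumptions made for the sketch and are not the application's actual code.

# Illustrative dispatch of touch events (201) by the input touch classifier (203).
# Event fields, UI-state fields and handler names are assumptions for this sketch.

def classify_touch_event(event, ui_state):
    """Route a touch event based on its type, its position and the current UI state."""
    if event["type"] in ("pinch", "two_finger_swipe"):
        return "scrolling_manager"            # panning / zooming (205)
    if ui_state.get("menu_open"):
        return "menus_and_popovers_manager"   # touches in the context of menus (207)
    if event["type"] == "drag" and ui_state.get("has_selection"):
        return "dragging_manager"             # dragging score symbols (209)
    return "strokes_classifier"               # potential stroke gestures (211)

# Example: a single-finger stroke with no menu open is routed to the strokes classifier.
print(classify_touch_event({"type": "stroke"}, {"menu_open": False}))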

The score staves 102 serve as reference for drawing new score symbols.

User's single strokes are categorized by the input touch classifier 203 as potential new symbol gestures if created by applying a continuous stroke relative to an empty part of the score (an empty part means a place on the score where there are no score symbols, disregarding the staff lines). Potential new symbol gestures are handled by the strokes classifier 211, which is responsible for understanding the user's meaning behind his stroke and making the proper changes to the score accordingly.

Fig 3, Fig 4 and Fig 5 show some exemplary symbols that can be recognized under these conditions. The images are arranged in pairs, where the left image shows the stroke the user made as a dotted line and the right image shows the added symbol on the score resulting from that stroke. Full note circles (as seen in the right images of 301 and 302) may be added by drawing a full circle 302 or by drawing a shortcut shape (a small diagonal line, for example) 301. Automatically adding a stem to the chord's note circle may be configured in an external settings menu (a stem is a line facing up or down from note circles as part of a chord; a chord is a score symbol containing one or more note circles, which may or may not have a stem). In some cases adding a score symbol may cause score symbols to be added to other staves; e.g. drawing a vertical line over a staff, as in 401, or over several staves, will add a measure bar (a vertical line separating measures in the score, as seen on the right of 401) that spans through all the repetition's staves for the current time position on the score. Other symbols such as chord harmony symbols (e.g. A, B, C, D harmony symbols) can be added in a similar fashion.

User's strokes are categorized by the input touch classifier 203 as potential change gestures to an existing score symbol or symbols if created by applying a continuous stroke relative to a previously added score symbol or symbols.

Figs 6 through 12 show some of these possible changes using three images for each situation, where the left image shows the relevant part of the score before applying the stroke, the middle image shows the stroke the user made as a dotted line and the right image shows the relevant part of the score after applying the stroke. Potential change gestures to an existing score symbol or symbols are handled by the strokes classifier 211, which is responsible for understanding the user's meaning behind his stroke and making the proper changes to the score accordingly. Figs 6 through 10 show some of the possible changes made by adding a single stroke to a previously added score symbol. Adding a note circle to a chord, 804, may be done, for example, by drawing one of the three circle strokes, 301, 302 or 303, close to the position of a chord score symbol on the score.

Adding a duration dot (a duration dot is a symbol added to chords and rests to prolong their duration and can be seen on the right image of 601) to a chord with no duration dots 601 and adding a duration dot to a chord with one or more duration dots 701 may be accomplished similarly, for example by tapping once (stroking a dot) on the right-hand side of the score symbol. The same goes for adding a duration dot to all kinds of rest score symbols; 602 is an example of adding a duration dot to an eighth rest (an eighth rest can be seen on the left image of 602) score symbol.

Adding an eighth note tail (an eighth note tail can be seen on the right image of 801 attached to the previously available quarter note; a 16th note tail can be seen in the right image of 802) to a chord without an eighth note tail 801 may be accomplished by drawing an eighth note tail shaped stroke on the right side of a chord that starts or finishes in proximity to the end of that chord's line. Changing an eighth note tail from an eighth tail to a 16th tail 802, or from a 16th to a 1/32 tail and so on, may be accomplished in the same manner.

Changing a rest score symbol from an eighth rest to a 16th rest, or from a 16th to a 1/32 rest and so on, may be accomplished by drawing an eighth rest shaped stroke right above 703 or below 704 the rest score symbol.

Adding a flat to a note circle may be accomplished by drawing a flat shaped stroke on the left side of a note circle 901 (a flat can be seen on the right image of 901, near and on the left side of the quarter note). Changing a flat to a double flat may be accomplished by drawing a flat shaped stroke on the left side of a previously added flat 902.

In some cases changing a score symbol may cause changes to score symbols on other staves; e.g. drawing a vertical line that covers a staff near a measure bar, as in 903, will change the measure bar to a double measure bar (a double measure bar can be seen on the right image of 903) that spans all the instrument staves at the current time position on the score.

Fig 10 shows some exemplary changes to a chord's ornaments and articulation marks that can be achieved using a single stroke.

Figs 11 and 12 show some exemplary connections of several score symbols that can be achieved using a single stroke. For user gestures that were not recognized by the system as new score symbols, as an edit to a previously available score symbol, or as an editing function like selecting, the user's stroke trail will remain on the score either for a limited time or forever, depending on a system setting. Multiple stroke score symbols (e.g. sharps) are recognized by aggregating the user's strokes with proximate saved unrecognized gestures. After recognition, the system replaces the unrecognized gestures with the multi stroke score symbol.

Adding accidentals (sharp, double sharp, flat and double flat) may also be done by swiping from the centre of a note - up for sharps or down for flats. Each swipe changes the accidental in halftone steps.
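As a minimal sketch of this halftone-step behaviour, the following illustration keeps a cumulative step count per note circle; the mapping, the clamping at double accidentals and the function name are assumptions made for this sketch.

# Illustrative mapping of cumulative halftone steps to accidentals, as could be
# kept per note circle while the user swipes up (sharps) or down (flats).
ACCIDENTALS = {-2: "double flat", -1: "flat", 0: "natural", 1: "sharp", 2: "double sharp"}

def accidental_after_swipe(current_steps, swipe_direction):
    """swipe_direction is +1 for an upward swipe, -1 for a downward swipe."""
    steps = max(-2, min(2, current_steps + swipe_direction))
    return steps, ACCIDENTALS[steps]

print(accidental_after_swipe(0, +1))   # (1, 'sharp')
print(accidental_after_swipe(1, +1))   # (2, 'double sharp')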

The user can select a score symbol by tapping and/or double tapping and/or long pressing it (specifically defined by a system setting) and select a group of score symbols by drawing a circle around them (Fig 13 shows a selected group of score symbols).

Erasing a score symbol or score symbols may be done either by drawing a squiggly line on top of it (Fig 14 shows an example of a quarter note being erased in this manner), or by selecting the element or elements and pressing an erase button in the context menu, or by drawing a squiggly line on top of the selected area.

To classify the user's strokes the strokes classifier 211 runs a series of mathematical operations, comparisons and queries to the score manager, as described in Fig 15. The strokes classifier receives raw touch data 1501 from the system's input touch classifier 203 and normalizes and cleans it 1502 to prepare it for classification. During touch data cleanup the system will, for example, trim the beginning of the stroke in case the user accidentally drew a tiny line as he put his finger on the display. During touch data normalization the system will, for example, run a spline interpolation on the touch data's touch points, to create a smooth path of data points with equal distance between each point. The strokes classifier 211 classifies the strokes into the different classes of potential results by grading each of the potential results, filtering out irrelevant results, and then sorting the results by priority and grade. Filtering and grading the stroke 1503 is done using a given list of grading formulas, one for each potential result, built out of boolean expressions and linear combinations of results of comparisons and queries for score symbols in the score.

By collecting usage data from many users 1504 to the remote server(s) 219, the system creates a vast amount of training data that is then processed by human experts alongside a machine learning system (such as a random forest or a neural network), resulting in an improved list of grading formulas and an improved result priorities list (used later for deciding between results with similar grades); the system regularly updates these lists from the server. The same system can work without a remote server, having a machine learning unit on the device itself instead of on a remote server. The system can use different types of features for comparisons and grading; four of the main features used by the system's filtering and grading formulas are:

1. the stroke's size and ratio 1505;

2. the stroke's proximity to other score symbols on the score 1506;

3. the stroke's vector of directions 1507; and

4. the stroke's normalized and transformed shape 1508.

1. The stroke's size and ratio is used to filter potential results by their sheer size and general size ratio. For example, the result of adding a note to a chord using a small diagonal line cannot be achieved with a stroke having a width or height larger than four times the gap between the staff's lines. The stroke's size and ratio are also used to grade potential results by comparing them to their grading formula's predefined optimal size and optimal ratio values, in case these have been predefined.
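A rough sketch of such a size-and-ratio filter and grade might look as follows; the threshold factor, the grading function and the point representation are assumptions made for this sketch, not the actual grading formulas.

# Illustrative size/ratio filter and grade for a single candidate result.
# Threshold factor, grading function and point representation are assumptions.

def stroke_bounds(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return max(xs) - min(xs), max(ys) - min(ys)      # (width, height)

def passes_size_filter(points, staff_line_gap, max_factor=4.0):
    """E.g. a 'small diagonal line adds a note circle' result is rejected when the
    stroke is wider or taller than max_factor times the gap between staff lines."""
    width, height = stroke_bounds(points)
    return width <= max_factor * staff_line_gap and height <= max_factor * staff_line_gap

def size_ratio_grade(points, optimal_size, optimal_ratio):
    """Grade drops as the stroke deviates from the formula's optimal size and ratio."""
    width, height = stroke_bounds(points)
    size = max(width, height)
    ratio = width / height if height else float("inf")
    return 1.0 / (1.0 + abs(size - optimal_size) + abs(ratio - optimal_ratio))

# Example: a short diagonal stroke on a staff with a 10-unit gap between lines.
print(passes_size_filter([(0, 0), (8, 8)], staff_line_gap=10))   # True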

2. Mapping the stroke's proximity to other score symbols on the score is used to filter potential results that need to be close to another score symbol or symbols, or by themselves; e.g. adding a new chord with a single whole note using a circular gesture cannot be accomplished when the stroke is written on top of another chord. Mapping the proximity to other score symbols is also used to grade the stroke by, for example, the distance of the stroke's extremities to other symbols compared to the predefined optimal distance, if predefined in the grading formula; e.g. when adding a beam to connect two quarters, the distances from the beginning and ending points of the stroke to the tips of each quarter's stem are expected to be optimally zero.
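For the beam example above, a proximity grade could be sketched roughly as follows; the distance-to-grade mapping and the point representation are assumptions for illustration.

import math

# Illustrative proximity grading: distance of the stroke's extremities to the
# anchor points of nearby score symbols (here, two stem tips), where the grading
# formula expects an optimal distance of zero.

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def beam_proximity_grade(stroke_points, stem_tip_left, stem_tip_right):
    """Higher grade the closer the stroke's ends are to the two stem tips."""
    d = distance(stroke_points[0], stem_tip_left) + distance(stroke_points[-1], stem_tip_right)
    return 1.0 / (1.0 + d)   # d == 0 (optimal) gives the maximal grade 1.0

print(beam_proximity_grade([(0, 0), (5, -1), (10, -2)], (0, 0), (10, -2)))  # 1.0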

3. The stroke's vector of directions is used to filter potential results by directions. The vector of directions is built by following the touch data's touch points; whenever a predefined distance threshold is passed, the last leg's direction is calculated and added to the vector. Fig 17 shows one of the possible ways to map directions; the system can also divide the space into more directions if required. To filter results the system runs a series of regular expression comparisons on the stroke's vector of directions (in case they are defined in the result's grading formula); e.g. an 'adding an eighth tail to a quarter with a stem heading up, from top to bottom' stroke should contain the directions 3 and 4, in this order, and should not contain any 6's or 7's at all; if these conditions are not met the stroke cannot be an 'adding an eighth tail to a quarter with a stem heading up, from top to bottom' stroke. To grade a result that was not filtered out, the stroke's vector of directions is compared to the optimal vector of directions, if given in the grading formula.

4. The stroke's normalized and transformed shape is used to filter and grade potential results by normalizing the stroke to different sizes, transforming the data and running boolean comparisons on the results, as defined in the grading formula. Fig 16 shows an example of adding an eighth tail stroke to a quarter note, with a stem facing up, from top to bottom 1601. As part of the filtering conditions of the 'adding an eighth tail to a quarter with a stem heading up, from top to bottom' grading formula, the stroke's touch data normalized to a 2 X 5 grid 1602 (width X height) should have cells 4 and 6 empty, cells 5 and 7 full, at least one of cells 8 and 9 full, one of cells 0 and 2 full and one of cells 1 and 3 full; and when the stroke's touch data is normalized to a 4 X 4 grid, cells 8, 9, 12 and 13 should be empty, 14 or 15 and 10 or 11 full, 0 or 4 full, 6 full and 3 empty. This example in boolean form:

((2X5 normalization) & !4 & !6 & 5 & 7 & (8 | 9) & (0 | 2) & (1 | 3)) & ((4X4 normalization) & !8 & !9 & !12 & !13 & (14 | 15) & (10 | 11) & (0 | 4) & 6 & !3)

The filtering and grading results are sorted by their grade as derived from their filtration grade, and only those results that pass a system predefined grade are considered as having passed the filtration.
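Purely as an illustration of features 3 and 4, the following sketch builds a vector of directions and tests a 2 X 5 grid occupancy condition similar to the boolean form above; the eight-direction numbering, the grid cell numbering, the distance threshold and the point representation are all assumptions and may differ from the actual mappings of Fig 16 and Fig 17.

import math
import re

# Illustrative extraction of features 3 and 4. The 8-direction numbering, grid
# cell numbering (cell index = row * columns + column, top-left first), distance
# threshold and point representation are assumptions for this sketch.

def direction_vector(points, threshold=10.0, sectors=8):
    """Append the direction of each leg once its length passes the threshold."""
    dirs, last = [], points[0]
    for p in points[1:]:
        if math.hypot(p[0] - last[0], p[1] - last[1]) >= threshold:
            angle = math.atan2(p[1] - last[1], p[0] - last[0])
            dirs.append(int(round(angle / (2 * math.pi / sectors))) % sectors)
            last = p
    return dirs

def matches_direction_rule(dirs):
    """E.g. 'eighth tail, stem up, drawn top to bottom': a 3 followed later by a 4,
    and no 6's or 7's at all (expressed here as regular expressions)."""
    s = ",".join(map(str, dirs))
    return re.search(r"3(,\d)*,4", s) is not None and re.search(r"[67]", s) is None

def occupied_cells(points, cols, rows):
    """Normalize the stroke's bounding box to a cols x rows grid and report which
    cells contain at least one touch point."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    width, height = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    cells = set()
    for x, y in points:
        col = min(cols - 1, int((x - min(xs)) / width * cols))
        row = min(rows - 1, int((y - min(ys)) / height * rows))
        cells.add(row * cols + col)
    return cells

def passes_2x5_condition(points):
    c = occupied_cells(points, cols=2, rows=5)
    return (4 not in c and 6 not in c and 5 in c and 7 in c
            and (8 in c or 9 in c) and (0 in c or 2 in c) and (1 in c or 3 in c))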

Results that passed filtering are then tested against surrounding, previously added, unrecognized gestures to build multiple stroke symbols 1509, if present. In case unrecognized gestures are present and expect a stroke of the sort of one of the results that passed the filtration, an 'add stroke to unrecognized gesture' result is added to the passed filtration results, with a grade depending on how common the multi stroke score symbol is and on the number of missing strokes needed to finish drawing the multi stroke score symbol. In case only one result passed filtration, or in case the first result in the sorted grade list has a definitive gap over the rest of the results, that result will be treated as a clear cut result 1510. In case of a clear cut result before the user has finished his stroke 1513, the system will suggest the stroke to the user as a stroke hint for auto completion 1516; otherwise the system will add or change the relevant score symbols 1517 with the help of the score manager 220. In case no results passed filtration 1511, the system will save the stroke's touch data, for a system defined event, in memory and on the display, to be used for multi stroke shape recognition on following strokes by the user 1512. In case more than one result passes filtration, the system will sort the results based on a predefined stroke priority in combination with the results' grades 1515. If the user has not yet finished his stroke 1514, the system will suggest the top results from the sorted list as hints and options for auto stroke completion 1518; in case the user has finished the stroke, the top result from the sorted list will be selected by the system and then used to add or change score symbols 1517 with the score manager 220.
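The decision among filtered results might be sketched roughly as follows; the result structure, the 'definitive gap' value and the tie-breaking order are assumptions made for this illustration.

# Illustrative decision step over results that passed filtration (1510, 1515).
# The gap value and the result structure are assumptions for this sketch.

def pick_result(results, definitive_gap=0.3):
    """results: list of (name, grade, priority); returns (chosen, is_clear_cut)."""
    if not results:
        return None, False                         # nothing passed filtration (1511)
    by_grade = sorted(results, key=lambda r: r[1], reverse=True)
    if len(by_grade) == 1 or by_grade[0][1] - by_grade[1][1] >= definitive_gap:
        return by_grade[0], True                   # clear cut result (1510)
    by_priority = sorted(results, key=lambda r: (r[2], r[1]), reverse=True)
    return by_priority[0], False                   # priority breaks near ties (1515)

print(pick_result([("add eighth tail", 0.9, 2), ("add slur", 0.4, 1)]))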

Fig 18 shows the need for stroke priorities: the image on the left 1801 shows two quarter notes with a diagonal line between the tips of their stems. Both the top right image 1802 and the bottom right image 1803 show viable outcomes that the system may arrive at starting from the same user stroke shown in the left image. A unique priority value for stroke results is therefore part of the system's grading formulas.

The user can change the properties of a selected score symbol or a group of selected score symbols by using a context menu. Fig 19 shows two examples of context menus, 1901 for notes and 1903 for measure bars. Context menus (e.g. 1902) can be opened by tapping and/or long pressing and/or double tapping a score symbol, or by selecting it (specifically defined by a system setting). Touch events are classified as a dragging gesture, to be handled by the dragging manager 209, either by initiating a hold and drag gesture over a score symbol or a selected score symbol or symbols and then dragging them, and/or by pressing or long pressing a score symbol or a selected group of score symbols and then dragging them. The dragging manager is responsible for transposing notes properly during a drag, arranging the score around the dragged score symbol and other dragging related issues. Different note symbols react differently to dragging, e.g. a rest score symbol dragged by itself will not respond to vertical dragging but will slide horizontally on the staff in response to horizontal dragging, pushing other elements on the score to create a temporary empty space on the staff behind it; dragging a chord's note circle vertically will transpose the note, first by adding or removing accidentals and then in a diatonic manner in regards to the current scale and the position of the finger; dragging a chord's circle horizontally will drag the entire chord horizontally: if the drag is to the right, it will push all following score symbols on the staff to fit with the dragged element if needed, and may add new rest elements behind it if configured to do so by a system setting. Fig 20 shows, in three images, a horizontal dragging of a quarter note: 2001 shows the score before the dragging begins; 2002 shows the score after a relatively short dragging distance, where an eighth rest has been added; 2003 shows the score after a relatively longer dragging distance, now with a new rest value of a quarter. Using dedicated menus the system lets the user add free form drawings, text and annotations to the score 212.

The local file manager 221 is responsible for periodically saving the score data, managing the user's files, managing file sync & collaboration features with the remote server and remote database units 219, alongside social features.

When the system or the user triggers audio playback of the music written on the score from any point, the audio manager 206 triggers the audio & MIDI sequencer 216 to play back from that point, which in turn triggers the score interpreter 218. The score interpreter scans the score from beginning to end and builds a mapping of the score's MIDI events representing playback in all channels. The score interpreter is responsible for understanding the score in preparation for playback, for example translating a chord's duration and gain values at every point in time, taking into account dynamics, articulations, repeat signs and time signature symbols. The audio & MIDI sequencer times the MIDI events created by the score interpreter and plays them using an audio module containing sound samples and/or sound synthesizing abilities, connected to any kind of audio output 214 and to the MIDI and other digital music communication protocols output 215. The audio sequencer, alongside the rendering manager 217, is also used to export the score to external file formats 213, where the audio sequencer handles the export of the music and the rendering manager handles the export of the graphics.

Drawing score symbols freely will, in most cases, leave the score symbols un-arranged by their duration, as they are supposed to be on a professional score, and might also result in illegal or poorly readable scores (e.g. too many or not enough notes in a measure, or notes that should be played at the same time on different staves appearing in different places on the x axis), so we introduce a score debugger 222 and a score arranger 223 unit. Fig 21 and Fig 22 show an example of a possible interface for the score debugger.
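Before turning to the score debugger, the following sketch gives a rough, assumption-laden illustration of the kind of flattening the score interpreter 218 performs when building the timed event mapping described above; the score representation and the dynamics-to-velocity mapping are invented for this sketch and are not the application's data model.

# Illustrative sketch of flattening a score into timed MIDI-like events for the
# sequencer (216). The score representation and the velocity mapping are assumptions.

def interpret(score, seconds_per_beat=0.5):
    """score: list of measures; each measure is a list of
    (midi_note_numbers, duration_in_beats, dynamic) chords. Returns
    (start_time, duration, note, velocity) events for all channels merged."""
    velocity_for = {"pp": 40, "p": 55, "mf": 80, "f": 100, "ff": 115}
    events, t = [], 0.0
    for measure in score:
        for notes, beats, dynamic in measure:
            for note in notes:                      # rests have an empty note list
                events.append((t, beats * seconds_per_beat, note,
                               velocity_for.get(dynamic, 80)))
            t += beats * seconds_per_beat
    return events

# Example: one measure with a C-E-G quarter chord (mf) followed by a quarter rest.
print(interpret([[([60, 64, 67], 1.0, "mf"), ([], 1.0, "mf")]]))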

The score debugger continuously analyses the score for problematic areas and indicates the issues found to the user in a clickable area 2102 (hereinafter called the issues indicator) and/or as an indication proximate to the relevant score symbols 2101 (e.g. an underline or a bounding box, hereinafter called the issue indication). The user can tap and/or long press and/or double tap the issues indicator (specifically defined by a system setting) to see a list of the issues the system has found and their descriptions, and when possible, the system will suggest possible solutions for the user to choose from 2103. The user can also tap and/or long press and/or double tap the issue indication (specifically defined by a system setting) to see a description of the issue, and when possible, the system will suggest possible solutions for the user to choose from by a single tap. In both cases, after the user picks a solution, the system will fix the relevant part of the score accordingly. The user can also choose to have the system ignore the current issue and treat it as a non-issue.

Issues in the score can be of several categories:

1. Number of notes/beats

2. Poor score readability

3. Diatonic fit of notes to a scale

4. Unexpected Chord symbol

5. Missing repeat symbols (missing 'to coda' sign when there's a coda)

To analyze the score, the debugger scans the score from beginning to end, looking for different characteristics per category. The following are a few examples of issues that the debugger might find for each issue category, alongside analysis strategies and some automatic solutions that may be available:
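
At a high level, this scan can be pictured as a single pass that dispatches each measure to per-category checks. The skeleton below is a hypothetical illustration only; the checker functions and the issue representation are assumptions, not the application's actual structure:

# Hypothetical skeleton of the debugger's scan loop.
def debug_score(measures, checkers):
    """Run every category checker over every measure and collect the issues found."""
    issues = []
    for index, measure in enumerate(measures):
        for check in checkers:   # e.g. beat-count, readability, diatonic-fit checkers
            issues.extend(check(index, measure))
    return issues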

1. Number of notes/beats issues:

a. Too many/few notes in a measure

Fig 21 shows an example of the way the score debugger indicates the issue (left image) and the suggested solutions for this issue (right image); a minimal sketch of this check appears after the solution list below.

• Analysis strategy:

o For every measure in the score-

■ Check if the measure begins with a time signature symbol and keep track of changes in the time signature

■ Aggregate the duration of all the notes in the measure

■ Compare aggregated note duration sum to the expected duration from the current time signature value for the measure

o Aggregate the rhythmic patterns of note durations in all scanned measures that had no issues, to offer better options as automatic solutions for issues in following measures if needed

• Automatic solution alternatives:

o Too many notes in the measure-

■ Change the time signature of this measure to fit aggregated note duration

■ Move some notes to next measure (recursively as an option)

■ Change the measure's note durations to fit one of the previously aggregated rhythmic patterns that has the same number of notes, if available

■ Change some note durations to fit the aggregated note duration (e.g. make a long note shorter if available)

■ Delete some notes to fit aggregated note duration

o Too few notes in the measure-

■ Change the time signatures around this measure to fit aggregated note duration

■ Add sufficient rest symbols to fit aggregated note duration

■ Change the measure's note durations to fit one of the previously aggregated rhythmic patterns that has the same number of notes, if available

■ Change some note durations to fit the aggregated note duration (e.g. make the last note longer)
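
As a hedged illustration of the beat-count check described above, the sketch below compares the summed durations of a measure with the duration expected from its time signature. The data layout (durations in quarter-note units, a time signature tuple) is an assumption made for the example:

from fractions import Fraction

def check_measure_duration(note_durations, time_signature):
    """Compare a measure's summed note/rest durations with its time signature.

    note_durations: durations in quarter-note units, e.g. Fraction(1, 2) for an eighth.
    time_signature: (numerator, denominator) tuple, e.g. (3, 4).
    Returns a negative value for too few beats, positive for too many, zero if correct.
    """
    numerator, denominator = time_signature
    expected = Fraction(numerator) * Fraction(4, denominator)  # in quarter notes
    actual = sum(note_durations, Fraction(0))
    return actual - expected

# e.g. check_measure_duration([Fraction(1), Fraction(1), Fraction(1, 2)], (3, 4))
# returns Fraction(-1, 2): the measure is an eighth note short.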

b. Too many/few notes in a tuplet to fit the tuplet's division (represented by the tuplet numeric value)

Fig 22 shows an example of the way the score debugger handles a tuplet with too few notes under its bridge 2202 or too many notes under its bridge 2203, and some of the automatic solutions the system suggests; a minimal sketch of this check follows the solution alternatives below.

• Analysis strategy:

o For every tuplet in the score-

■ Aggregate the durations of all rests and notes under the tuplet's bridge

■ Compare the aggregated note duration sum to the tuplet's numeric value and make sure that the sum divided by the numeric value equals, without remainder, one of the basic note durations (whole note, half note, quarter note, eighth note, 16th note, 32nd note or 64th note).

• Automatic solution alternatives:

o Too many notes, or note durations are too long, under the tuplet bridge

■ Remove tuplet bridge

■ Move one or more of the notes that are under the tuplet out of it (in case that fixes the issue)

■ Shorten the duration of the note with the highest duration under the tuplet (in case there is a distinctly longer note and this solves the issue)

■ Change the duration of all the notes under the tuplet to fit the closest basic duration value to the durations sum divided by the tuplet's numeric value (in case the number of notes and rests in the tuplet is the same as the tuplet's numeric value).

■ Change the tuplet's numeric value to fit the note durations (in case another tuplet value solves the issue)

o Too few notes under the tuplet, or note durations are too short

■ Remove tuplet bridge

■ Add rests to fill the tuplet

■ Extend the duration of one or more of the tuplet's notes

■ Change the duration of all the notes under the tuplet to fit the closest basic duration value to the durations sum divided by the tuplet's numeric value (in case the number of notes and rests in the tuplet is the same as the tuplet's numeric value).

■ Change the tuplet's numeric value to fit the note durations (in case another tuplet value solves the issue)
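
A hedged sketch of the tuplet validity check described above follows; it assumes durations expressed in quarter-note units and only verifies that the aggregated duration under the bridge divides evenly into one of the basic note values:

from fractions import Fraction

BASIC_DURATIONS = [Fraction(4), Fraction(2), Fraction(1),
                   Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 16)]
# whole, half, quarter, eighth, 16th, 32nd and 64th notes, in quarter-note units

def tuplet_is_valid(member_durations, tuplet_value):
    """True if the durations under the tuplet bridge divide evenly into a basic value."""
    total = sum(member_durations, Fraction(0))
    per_division = total / tuplet_value
    return per_division in BASIC_DURATIONS

# e.g. a triplet of three eighth notes: tuplet_is_valid([Fraction(1, 2)] * 3, 3) -> True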

2. Poor score readability - scan the score and check every measure for issues such as the following (a small illustrative sketch follows these examples):

· Wrong stem direction - in proper music notation, all stems of notes below the middle line that are not connected with a beam should go up, and all stems above the middle line should go down. Notes on the middle line should have their stem go down, except when the adjacent notes' stems point the opposite way. Fig 23 shows proper stem directions for quarter note chords with a single note circle each - notes below the middle of the staff have a stem heading up 2301, notes above the middle of the staff have a stem heading down 2303, and notes on the middle of the staff usually have a stem heading down 2302, unless the surrounding stems are heading up.

• Improper note beaming - when scoring, it is important to keep the notes beamed together within the beats of the time signature. Fig 24 shows an example of improper beaming, as the first two beats land in the middle of a beam (one after the second chord note from the left and another before the second chord note from the right) 2401, and the same notes under proper note beaming 2402.
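
The stem direction convention above lends itself to a very small check. The sketch below is a hedged illustration only; the staff-position representation (an integer offset from the middle line) is an assumption:

def expected_stem_direction(staff_position, middle_line=0):
    """Return the conventional stem direction for an unbeamed note.

    staff_position: vertical position relative to the middle staff line
    (positive = above, negative = below, 0 = on the middle line).
    Notes on or above the middle line get a downward stem by default.
    """
    return 'up' if staff_position < middle_line else 'down'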

3. Diatonic fit of notes to a scale - in modern music theory, it is recommended that notes that are non-diatonic to the current scale should not be placed on 'strong beats' (in terms of the music's harmonic pulse). A minimal sketch of this check follows the analysis and solution bullets below.

• Analysis strategy:

o Scan through the score while keeping track of the current scale signature

o Compare every note to surrounding strong beats, to the current scale at that point and to the harmony represented by a related chord symbol (if present)

• Automatic solution alternatives:

o Offer to add/remove accidentals (in case that solves the issue)
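
As a hedged sketch of the diatonic-fit check (the scale representation and the notion of a strong-beat note list are assumptions made for the example):

MAJOR_SCALE_STEPS = [0, 2, 4, 5, 7, 9, 11]

def is_diatonic(midi_pitch, scale_root_pitch_class, scale_steps=MAJOR_SCALE_STEPS):
    """True if the note's pitch class belongs to the current scale."""
    return (midi_pitch - scale_root_pitch_class) % 12 in scale_steps

def diatonic_fit_issues(strong_beat_pitches, scale_root_pitch_class):
    """Flag notes on strong beats that are non-diatonic to the current scale."""
    return [pitch for pitch in strong_beat_pitches
            if not is_diatonic(pitch, scale_root_pitch_class)]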

4. Unexpected Chord symbol-

• Missing chord tensions (chord tensions are extensions of the basic seventh chord and refer to notes outside the normal harmonic structure of the chord)

o Analysis strategy:

■ Scan through the score while keeping track of the current chord symbol, the notes and the current position on the music's pulse (the music's pulse consists of beats, a (repeating) series of identical yet distinct periodic short-duration stimuli perceived as points in time occurring at the mensural level, classified into strong beats and weak beats and their sub-groups).

■ Check every strong beat (in terms of musical pulse) for notes that are not a part of the chord harmony represented by the current chord symbol if available.

o Automatic solution alternatives:

■ Add tensions to the chord

■ Add accidental to current note to fit scale

• Chord and scale mismatch

• Analysis strategy:

o Scan through the score while keeping track of the current scale signature

o Check every chord symbol against the current scale. In case the chord harmony represented by the chord symbol isn't a part of the current scale-

■ check if the progression formed by the chord symbol and its surrounding chord symbols (if available) on the music's pulse is a common harmonic progression, by comparing it to a local database of harmonic progressions that is sorted by popularity of use (for example: |II-V|-I, |I-VI|-|II-V|, etc., where these numerals represent ranks in a scale). For each available harmonic pattern, the system will give a correlation score calculated by comparing all the notes represented in the chord symbol's harmony against those of the chord calculated from the current scale's rank represented in the harmonic progression, at the same harmonic pulse position (as a result, for example, "Dm7, G| Cmaj7" will have a lower correlation score than "Dm7, G7| Cmaj7"). A minimal sketch of this correlation scoring appears after the issue category list below.

In case there is a perfect fit to a pattern there is no issue.

• Automatic solution alternatives:

o Offer an alternative chord if available. Available alternatives are derived by-

■ sorting the results of the harmonic progression correlation array created during analysis by correlation score

■ those progressions with a correlation score higher than a system defined threshold are considered viable alternatives

5. Missing repeat symbols

• Analysis strategy: While scanning through the score, keep track of repeat measure bars and other repeat signs like coda, to coda, segno, etc., while making sure there is consistency in the score's repeat signs compared to music theory. Some examples of possible issues:

o missing repeat-from measure bar sign after a repeat-to measure bar sign

o missing segno symbol before a dal segno al coda (some of the commonly available repeat related score symbols)

• Automatic solution alternatives:

o Automatically place the missing symbol in default positions. The default positions are determined by the structure of the score and the missing symbol. For example, a missing repeat-from symbol will be placed at the end of the score/staff (depending on the presence of other repetition symbols on the score).

o Guide user to add missing symbol through an interface
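
Returning to the chord/scale mismatch analysis of category 4, the correlation scoring can be sketched as below. This is a hedged illustration only: the application compares the notes of each chord position in detail, while this sketch simply counts shared chord tones per position, and the progression database, threshold value and data layout are assumptions:

# Hypothetical sketch of scoring a written chord progression against known patterns.
def chord_tone_overlap(written_chord_tones, pattern_chord_tones):
    """Fraction of the pattern's chord tones that also appear in the written chord."""
    if not pattern_chord_tones:
        return 0.0
    return len(set(written_chord_tones) & set(pattern_chord_tones)) / len(set(pattern_chord_tones))

def progression_correlation(written_progression, pattern_progression):
    """Average per-position chord-tone overlap between two equally long progressions."""
    scores = [chord_tone_overlap(w, p)
              for w, p in zip(written_progression, pattern_progression)]
    return sum(scores) / len(scores) if scores else 0.0

def viable_alternatives(written_progression, pattern_db, threshold=0.75):
    """Return known progressions whose correlation exceeds a system defined threshold."""
    scored = [(progression_correlation(written_progression, pattern), pattern)
              for pattern in pattern_db if len(pattern) == len(written_progression)]
    scored = [item for item in scored if item[0] >= threshold]
    return sorted(scored, key=lambda item: item[0], reverse=True)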

The score arranger will be triggered either automatically or by the user performing a dedicated action (depending on system settings) and will arrange the score symbols on the score relative to their duration and by professional score rules. Fig 25 shows a small example of arranging a measure by duration; the top image is before arrangement and the bottom image represents the measure after arrangement by duration. To achieve that, the system performs a series of actions such as the following on a single staff/ repetition/ group of repetitions/ all of the score's repetitions, depending on the user's action that triggered the score arranger:

· Review every measure in the repetition from start to finish to check if there are missing notes in the measure compared to the duration derived from the time signature of that measure (similarly to, or with the help of, the Score Debugger)-

o Check if the measure begins with a time signature symbol and keep track of changes in the time signature while scanning

o Aggregate the duration of all the notes and rests in the measure

o Subtract the aggregated note durations sum from the expected duration derived from the current time signature of the measure to get the duration gap

o Padding - if the duration gap is bigger than zero, fill in the gap by adding to the staff as many rest symbols as needed, of a system configured note duration, e.g. eighth rests, either visible to the user or not, to be removed or not after the arrangement is done, depending on the system's settings

• The system scans the repetition from beginning to end and for each measure and for each instrument staff, the system maps all notes and other on-staff symbols with no duration (symbols that don't have a duration but are associated with a certain beat such as measure bars, key signatures, clef signatures, etc) to their corresponding beat

• For each beat mapped, the system creates a formula to represent the minimum spacing needed to the left of that beat's timing point (the system can also be implemented to look at the calculation from the right instead, in a similar way) to facilitate the requirements of all the score symbols that were mapped to it (a beat's timing point is a point on the score that represents the final placement of the beat, with which all score symbols associated with that beat should be aligned). Spacing formulas are constructed as an arithmetic operator, such as max, min, sum, subtraction, addition, etc., and two objects (operands) to run the operator on; each operand can be either a numerical scalar size representing a width in pixels, a numerical scalar size representing the note's duration (to be later multiplied by a variable parameter value, the size for quarter), or another spacing formula. Spacing formulas are solved by assigning a size for quarter value as the formula's parameter. The system determines the beat's left spacing formula by:

o Finding the previous note or rest for each note and rest symbol mapped for that beat (if available)

o Creating a left spacing formula for each note and rest symbol mapped for that beat, a right spacing formula for each previous note or rest if found, and combining them into one spacing formula using a 'Sum' operand

■ Left spacing for a note may include, among others: accidental width + minimal spacing between accidentals and the note (when it comes to notes), a system defined minimal left spacing setting, the sizes of measure bar elements and other on-staff symbols with no duration and their corresponding surrounding spacing formulas, etc.

■ Right spacing is normally a 'Max' spacing formula that receives a spacing formula representing the note's duration (note duration multiplied by size for quarter) and a spacing formula that includes the sum of the note's circle size, duration dots size and spacing, size of attached symbols (symbols that are dependent on another symbol rather than on a beat, such as articulation and dynamics symbols, lyrics, chord symbols, etc)

■ For example - a single staff with a single quarter note element with a single note circle that has a sharp, a duration dot and a stem facing up, and a lyrics track with one word associated with the note, will have a left spacing formula of: width of sharp + default spacing between accidentals and notes (a constant formula with no variable); and a right spacing formula of the max between-

• 1.5* size for quarter (the note is 1.5 quarters because of duration dot) + system defined minimal spacing between staff symbols (symbols that are bound to the staff and are serially placed one after the other - notes, rests, measure bars, time/key/clef signatures and such)

• size of note circle + default spacing between notes and duration dots + width of duration dots + system defined minimal spacing between staff symbols

• associated lyrics word width + system defined minimal spacing between staff symbols

o Combining all the 'Sum' formulas under a 'MAX' operand, creating the beat's left spacing formula.

Calculating the optimal size for quarter value for the score symbols to fit optimally in the repetition is done by first passing a system defined default value to the repetition's spacing formula and measuring the result against the available staff size. The spacing formula behaves like a sorted integer list, since its result grows with the size for quarter value, so by doing a binary search (or any other common search approach) the system finds the optimal value for the size for quarter variable and thereby determines the position of all the mapped score symbols.
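
The search described above can be sketched as follows. This is only a hedged illustration: each beat's spacing formula is modelled as a callable of the size for quarter value, the pixel values in the example lambda are made up, and the search bounds and iteration count are assumptions:

def staff_width_needed(beat_spacing_formulas, size_for_quarter):
    """Total width (in pixels) of a staff for a given size-for-quarter value.

    Each spacing formula is modelled here as a callable taking the size-for-quarter
    value and returning the minimal left spacing of its beat, in pixels.
    """
    return sum(f(size_for_quarter) for f in beat_spacing_formulas)

def optimal_size_for_quarter(beat_spacing_formulas, available_width,
                             low=1.0, high=500.0, iterations=40):
    """Binary-search the largest size-for-quarter value that still fits the staff.

    This works because the total width grows monotonically with the size-for-quarter value.
    """
    for _ in range(iterations):
        mid = (low + high) / 2.0
        if staff_width_needed(beat_spacing_formulas, mid) <= available_width:
            low = mid
        else:
            high = mid
    return low

# Illustrative formula for the worked example above (all pixel values hypothetical):
# a quarter note with a sharp, a duration dot and one lyric word might contribute
# lambda q: 10 + 4 + max(1.5 * q + 6, 12 + 2 + 4 + 6, 40 + 6)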

A special score arranger feature, auto fit for editing, runs while the user edits or plays back the score. The auto fit for editing feature provides automatic flexibility in the locations of the score symbols on the score to allow easy editing and reading of the score, for example:

While the user adds notes to a measure, selects a note or a group of notes, or performs another system defined user or automatic trigger, the score arranger will create empty score space around the note or notes to enable the convenient addition of at least one more note following and/or preceding the notes entered or the measure (depending on a system setting).

When the user pans or zooms or triggers another pre-defined system action, the size of the previously edited measure is fitted to the actual notes in the measure.

The system has a special type of buttons (hereinafter called draggable buttons) that function in different ways depending on how they are used. Three examples of such draggable buttons to be used in the system are the Playback button, the Record button and the Auto fix button. Touch events that initiate on a draggable button will be managed by the draggable buttons manager 208 to enable the following types of functionality:

Playback button behaviour:

- Long pressing the button will play the entire score from the beginning

- Dragging the button toward the score creates a selection cursor; when the dragging is released, the system leaves an indicator at the position on the score where the selection cursor was last at and starts playing back the score from the selection cursor position. In case the user does not release the drag and holds statically for a short, system defined period of time, the selection cursor turns into a selection box, and releasing the hold will play only the part of the score which is selected. If the selection box was initiated by the user holding his pointing device over the beginning of a staff, the entire staff is selected, and if he holds for another system defined short period of time, that staff and all of its following staves for the same instrument will be selected. If the user drags the button over the middle of a staff, the entire measure is selected.

- Tapping the button will start playback from the last position on the score the system played back from. If the user hasn't played back yet, the system will initiate playback from the beginning of the score.

Record button behaviour:

The Record button behaves in the same way the Playback button does, but instead of playing back it records from user input using the recording manager 225. The system can record either from digital sources such as MIDI input or a virtual instrument 226, or monophonically from an audio input like a microphone 227.

In case the input originated from a digital signal, the system quantizes the data to better fit the bpm, analyzes the recording against the score's bpm and style to approximate chord and rest durations (for example, taking swing into account can make the difference between a triplet where the first two notes are tied and two swing eighth notes), and adds the recorded elements to the correct place on the score.
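
A minimal, hedged sketch of such quantization is shown below; the grid resolution and the choice to snap each onset to the nearest grid point are assumptions, and a real implementation would also account for swing and tuplet feels as described above:

def quantize_onsets(onset_times_seconds, bpm, grid_division=2):
    """Snap recorded onset times to the nearest grid point.

    grid_division: grid points per quarter note (2 = eighth-note grid).
    Returns onset positions in quarter-note units.
    """
    seconds_per_quarter = 60.0 / bpm
    grid = seconds_per_quarter / grid_division
    return [round(t / grid) / grid_division for t in onset_times_seconds]

# e.g. quantize_onsets([0.02, 0.27, 0.51], bpm=120, grid_division=2)
# -> [0.0, 0.5, 1.0]  (three eighth-note onsets at 120 bpm)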

In case the input originated from an audio source, the system will first run noise cancellation and compression algorithms on the entire recording to clean the signal; then, for each window in time (the size of which is defined by the system), the system will run a Fourier transform on the data, choose the frequency with the heaviest coefficient and transform it into a pitch, in order to identify the leading monophonic pitch at that time point. This process results in a data set similar to that of a digital recording made using a software or MIDI instrument, and processing will continue as such.
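
A hedged sketch of this per-window pitch estimation follows; it assumes numpy is available and deliberately uses the simplest possible strategy (strongest FFT bin converted to the nearest MIDI note), without the noise cancellation and compression steps mentioned above:

import numpy as np

def leading_pitch(window_samples, sample_rate):
    """Estimate the leading monophonic pitch of one analysis window.

    Picks the FFT bin with the largest magnitude and converts its frequency
    to the nearest MIDI note number (a deliberately simplified approach).
    """
    spectrum = np.abs(np.fft.rfft(window_samples * np.hanning(len(window_samples))))
    freqs = np.fft.rfftfreq(len(window_samples), d=1.0 / sample_rate)
    peak_freq = freqs[int(np.argmax(spectrum[1:])) + 1]   # skip the DC bin
    midi_note = 69 + 12 * np.log2(peak_freq / 440.0)
    return int(round(midi_note))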

Auto fix button behaviour:

- Long pressing the button will arrange the entire score and will automatically solve trivial score issues raised by the score debugger on the entire score.

- Dragging the button toward the score will create selection boxes. If the user drags the button over the beginning of a staff, the entire staff is selected, and if he holds for a short period of time, that staff and all of its following staves for the same instrument will be selected. If the user drags the button over the middle of a staff, the entire measure is selected.

When the dragging is stopped the system will arrange the selected part of the score and will automatically solve trivial score issues raised by the score debugger in the selected area.

- Tapping the button will auto fix from the last part of the score the system auto fixed. If the system or the user hasn't auto fixed yet, the system will auto fix the entire score.

This system can be used to edit the score using any computer unit that has a display (hereinafter the Computer Unit), whether it has a touchscreen display or not, by using another touchscreen based device in sync with the Computer Unit (e.g. a tablet or a smartphone device synced by bluetooth, wifi, web protocols or any other communication technology and protocols). Fig 26 illustrates this setup. The user uses the touchscreen based device 2603 in order to add and edit score symbols using strokes, and sees changes immediately on the Computer Unit's screen 2602. Alternatively, the user can use the Computer Unit's mouse, keyboard and menus to select, copy, paste, duplicate, transpose and edit score symbols with context menus, and add elements by selecting from an array of symbols. All changes are immediately synced between the devices. The user may also select an area on the score 2601 using the Computer Unit's input device, to be displayed and/or edited on the touchscreen device.

This system can also be used to edit the score using any laptop computer that has a display and a trackpad surface, as shown in Fig 27. The score is projected on the laptop's display on a virtual scrollable canvas view, and above it is projected a virtual contour in the shape and size of the laptop's trackpad 2701. The user pans and zooms around the score using two finger swipe and pinch gestures on the laptop's trackpad 2702. To add and edit score symbols using touch gestures, the user enters an editing mode (either by clicking inside the trackpad's contour on the display, or by another system defined user action). In the editing mode, the touches the user makes on the trackpad are translated as if made on a touch sensitive display at the correlating position on the projected contour of the trackpad on the laptop's display.

Score autocomplete - the system can have a score autocomplete unit and functionality 224 to make the act of composing faster. As the user adds a few notes and rests to the score, forming the beginning of a musical phrase, the autocomplete unit compares the rhythmic and pitch-change interval pattern of the beginning of the phrase with the beginnings of other rhythmic and pitch-change interval patterns of musical phrases aggregated from the previously added musical phrases in the score, from other scores made by the user and from other scores made by other users with the help of the system's servers, and looks for the phrases with the highest correlation. The unit will then offer the most relevant phrases to the user, as ways to continue or rewrite his musical phrase in a single tap. To offer only relevant solutions, the system will take care of diatonically transposing the compared music phrases, if needed, to fit the scale of the original beginning of the musical phrase the user wrote.
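
The phrase comparison can be pictured roughly as below. This is a hedged sketch only: the phrase signature (a rhythm list plus a pitch-interval list), the exact-match correlation measure and the stored-phrase layout are assumptions, not the application's actual matching algorithm:

def phrase_signature(durations, pitches):
    """Represent a phrase beginning by its rhythm and its pitch-change intervals."""
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    return list(durations), intervals

def phrase_correlation(sig_a, sig_b):
    """Fraction of matching rhythm values and pitch intervals over the shared prefix."""
    matches, total = 0, 0
    for seq_a, seq_b in zip(sig_a, sig_b):
        for a, b in zip(seq_a, seq_b):
            total += 1
            matches += (a == b)
    return matches / total if total else 0.0

def best_completions(current_signature, known_phrases, top_n=3):
    """Rank previously aggregated phrases by similarity to the phrase being written."""
    ranked = sorted(known_phrases,
                    key=lambda p: phrase_correlation(current_signature, p['signature']),
                    reverse=True)
    return ranked[:top_n]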

The system can also be used as a lock screen for computer systems. Fig 28 shows an example of the main interface of such a lock screen. To set a password (the notes on the staves represent the password), the user defines a single bar/ single staff/ staves with one or more notes and rests on them as a password. When the user attempts to unlock his device, a screen will appear with a single measure/ single staff/ staves (depending on the user's configured password) and the user needs to enter the notes and rests he defined as a password to unlock it. The actual unlocking will either be done as the user enters the last note of the password or by a user action such as pressing a done button. In case the user forgot his password, the user will be able to either reset his password or get a hint for his password in the form of an audio bit (representing a part of the password), or a text description or question regarding it.
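
A minimal, hedged sketch of the unlock comparison is shown below; the symbol representation (a kind/pitch/duration tuple per note or rest) is an assumption made only for illustration:

def unlocks(entered_symbols, password_symbols):
    """True if the notes and rests entered on the lock-screen staff match the password.

    Each symbol is modelled as a (kind, pitch, duration) tuple, with pitch None for rests.
    """
    return list(entered_symbols) == list(password_symbols)

# e.g. password: a quarter C4, an eighth rest, then a quarter E4
# unlocks([('note', 60, 1.0), ('rest', None, 0.5), ('note', 64, 1.0)],
#         [('note', 60, 1.0), ('rest', None, 0.5), ('note', 64, 1.0)]) -> True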