

Title:
DATA STORAGE DEVICES
Document Type and Number:
WIPO Patent Application WO/2020/240529
Kind Code:
A2
Abstract:
The present disclosure relates to data storage devices, more particularly to different types of devices which facilitate capturing, storing, listing, editing, deleting, displaying and/or playing different types of user data, viz. text-based and/or audio-based and/or visual-based. The devices provide a graphical user interface for user interaction. The user data may comprise login and password details, notes, codes, keys, bank account details, credit card details, ID proof details like social security details, important dates with or without their respective reminders, user dictations, images and/or videos and/or other professional and/or personal and/or generic data which the user wants to capture, store and/or access. The devices may further provide features related to security, convenience, usability and accessibility. Further, the devices may facilitate connecting with external devices for facilitating and performing various functions and/or features.

Inventors:
S GUPTA ABHINAV (IN)
Application Number:
PCT/IB2020/057048
Publication Date:
December 03, 2020
Filing Date:
July 26, 2020
Assignee:
S GUPTA ABHINAV (IN)
International Classes:
G11B27/031; G06F3/00; G06F3/06; G11B20/10; G11B27/10
Claims:
CLAIMS

1. An apparatus configured to facilitate the user with graphical user interface for capturing, storing, listing, editing, deleting and accessing user data, where user data is either text based user data, or audio based and visual based user data where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data, or text based user data with visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data and visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data; and further may or may not facilitate with one or more features, wherein features may be features related to user convenience or security, comprising:

At least one processor configured to execute codes stored in memory.

At least one memory comprising of codes.

At least one circuitry for user interface which facilitates in displaying the GUI, or taking user input, or both displaying the GUI and taking user input.

2. An apparatus configured to facilitate the user with graphical user interface for capturing, storing, listing, editing, deleting and accessing user data, where user data is either text based user data, or audio based and visual based user data where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data, or text based user data with visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data and visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data; and further may or may not facilitate with one or more features, wherein features may be features related to user convenience or security, comprising of:

At least one processor configured to execute codes stored in memory.

At least one memory comprising of codes.

At least one circuitry which facilitates in receiving sound-based user input or giving sound-based output, wherein

Sound based input is input for voice command for accessing any feature or a function, or input for user data, or input for user authentication.

Sound based user output is the sound based output of user data or sound-based output of other facilitated features and functions.

3. An apparatus configured to facilitate the user with graphical user interface for capturing, storing, listing, editing, deleting and accessing user data, where user data is either text based user data, or audio based and visual based user data where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data, or text based user data with visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data and visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data; and further may or may not facilitate with one or more features, wherein features may be features related to user convenience or security, comprising of:

At least one processor configured to execute codes stored in memory.

At least one memory comprising of computing codes.

At least one biometric sensor for user authentication where biometric sensor may be for visual based authentication or voice-based authentication or physical contact-based user authentication.

4. An apparatus configured to facilitate the user with graphical user interface for capturing, storing, listing, editing, deleting and accessing user data, where user data is either text based user data, or audio based and visual based user data where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data, or text based user data with visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data and visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data; and further may or may not facilitate with one or more features, wherein features may be features related to user convenience or security, comprising of:

At least one processor configured to execute codes stored in memory.

At least one memory comprising of computing codes.

At least one image sensor for taking visual based input.

5. An apparatus configured to facilitate the user with graphical user interface for capturing, storing, listing, editing, deleting and accessing user data, where user data is either text based user data, or audio based and visual based user data where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data, or text based user data with visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data, or text based user data with audio based user data and visual based user data, where visual-based user data may be image based user data or video-based user data or both image-based and video-based user data; and further may or may not facilitate with one or more features, wherein features may be features related to user convenience or security, comprising:

At least one processor configured to execute codes stored in memory.

At least one memory comprising of codes.

At least one hardware encryption or hardware security module which facilitates with hardware-based encryption.

6. An apparatus of claim 1-5 where the user interface comprises at least one unit which constitutes a single unit for displaying the graphical user interface and taking user input.

7. An apparatus of claim 1-6 where the user interface comprises a touch-sensitive unit.

8. An apparatus of claim 1-7 where the user interface comprises at least one unit which constitutes a separate unit for displaying the graphical user interface and taking user input, wherein

User input may be given by using physical contact-based input device or by using sound-based input device or by using visual based input device or by using external input device.

9. An apparatus of claim 1-7 which comprises a circuitry for taking user input from an interface, which acts as a shortcut key for activating or launching any function or a feature.

10. An apparatus of claim 8 which facilitates in launching or activating a feature or a function on which authorization or authentication method is implemented.

11. An apparatus of claim 1-9 which comprises at least one circuitry to facilitate wireless charging of an internal battery or wireless charging of external devices or both internal and external.

12. An apparatus of claim 1-10 which comprises at least one circuitry to receive sound-based input.

13. An apparatus of claim 1-11 which comprises at least one circuitry to give sound-based output.

14. An apparatus of claim 1-12, corresponding to claim 8 and claim 9, where audio interface facilitates as user interface for giving user input or receiving sound-based output or both giving user input and receiving sound-based output.

15. An apparatus of claim 1-13 which comprises at least one circuitry for capturing visual based input through an image sensor.

16. An apparatus of claim 1-14, which comprises an interface to connect with external devices, where the external devices are either input devices or output devices or both input output devices or storage devices or computing devices or printing devices.

17. An apparatus of claim 15 which facilitates in transferring data to or from external devices, wherein external devices may be storage devices or computing devices; text based user data can be transferred in other computer readable formats or in

18. An apparatus of claim 14-16 which facilitates in connecting with external visual devices or external audio devices or external display devices or external input devices or external printing devices or external computing devices or external memory devices.

19. An apparatus of claim 1-17, which comprises a sensor module comprising of at least one sensor where the sensor is either a motion sensor or proximity sensor or light-based sensor, which senses light.

20. An apparatus of claim 1-18, which comprises at least one light source which facilitates in giving light for at least one facilitated function or feature, wherein the light source is used in conjunction with an image sensor or biometric sensor or as a flash light or for device status indication.

21. An apparatus of claim 1-19, which comprises at least one biometric sensor, for authenticating the user with their biometric prints.

22. An apparatus of claim 1-20 which comprises of at least one memory or a processor to assist with cryptographic protocols for hardware-based encryption or software-based encryption.

23. An apparatus of claim 1-21 where the apparatus is a wearable device which is configured or adaptable to be worn by the user.

24. An apparatus of claim 22, where the apparatus can be worn on the wrist.

25. An apparatus of claim 1-22 where the apparatus is a non-wearable device.

26. A method comprising of giving user input, wherein wearable apparatus or non-wearable apparatus comprises:

One or more than one additional input interface unit which facilitates in giving input for launching a function or a feature with or without or both with and without interacting through the graphical user interface, wherein the input may be given by using one input unit or by using more than one input unit in combination to give user input.

An image sensor for biometric authentication where the method comprises: activating or deactivating or launching a function or a feature on which an authentication method has been implemented by giving one or more than one user input through one input unit or in combination with one or more than one input unit, such that the given input may activate the authentication method to authenticate the user through the said image sensor, wherein if the user is an authorized user, the user may get authorized and a feature or a function gets activated or deactivated or launched; if the user is not an authorized user the said feature or a function does not get activated or deactivated or launched.

Description:
Data Storage Devices

PRIORITY CLAIM

This application claims priority from Provisional Application No. 201911020838 filed in India on May 26, 2019.

TECHNICAL FIELD

The present subject matter described herein relates to devices which facilitate the user in capturing and storing data. More particularly, the invention relates to data storage devices for capturing, storing, listing, editing, deleting and accessing user data.

BACKGROUND

The embodiments of present disclosure relate to the data storage devices. More particularly, the aspects of present disclosure relate to different types of data storage devices for capturing, storing, listing, editing, deleting, accessing (displaying and/or viewing and/or playing) user(s) data.

In today's technology-driven world, people use multiple computing devices for communication and other needs. Users capture, store and access data such as text-based, audio-based and visual-based data by using one or more electronic devices, for example separate devices for capturing and accessing text-based data, audio-based data and visual-based data (images and/or videos). Many conventional devices meant for capturing and accessing user data, incorporating display and input methods, are either obsolete, lack support for capturing, storing or accessing different data types, lack features such as security, or are inflexible or difficult to operate. This is unfortunate, as it may force the user to use devices which do not completely fulfill their needs, or to use multiple devices to fulfill those needs, and such inflexibilities are frustrating to most users. Accordingly, there is a need for devices which are easy and flexible to use for capturing, storing and accessing user data or different types of user data.

SUMMARY OF INVENTION

The above deficiencies and other problems associated with capturing, storing and accessing user data can be reduced or eliminated by one or more of the devices disclosed below.

The aspects, embodiments and inventions relate to devices 100 comprising one or more processors configured to execute computer-readable codes (computer programs) and/or sets of instructions stored in at least one computer-readable storage medium, where the codes facilitate the user to interact with devices 100 for either capturing, storing, listing, editing, transferring (cut/copy and paste), deleting, accessing (displaying, viewing and/or playing), or any suitable combination thereof, text-based user data, or text-based user data with snapshots of the graphical user interface (GUI), or audio-based user data and visual-based user data (like images and/or videos etc.), or any suitable combination of text-based user data, audio-based user data and visual-based user data with or without snapshots of the GUI, and further comprise codes for facilitating one or more features and/or functions, wherein features may include, but are not limited to, features with respect to either security, convenience tools, device & system settings, or any suitable combination thereof. The user data may comprise login and password details, notes, codes, security key details, bank account details, credit card details, ID proof details like social security details etc., important dates and their reminders (giving notification by visuals and/or vibrations and/or sound), professional data, dictations comprising reminders, memorable data, sensitive information, important dictations, notes and any other data which the user wants to capture, visual data comprising memorable data, important data, sensitive information, and any other text-based, audio-based and/or video-based data which the user wants to capture and/or store and/or access. The invention is disclosed by the independent claims. Further embodiments are disclosed as dependent claims.
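For illustration only, the following is a minimal Python sketch of the kind of capture, list, edit and delete handling described above, using a simple in-memory store; the names UserRecord and DataStore and all fields are hypothetical and not taken from the disclosure.

# Minimal illustrative sketch of a user-data store supporting the capture,
# list, edit and delete operations described above. All names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional


@dataclass
class UserRecord:
    """One stored item: a login, note, code, card detail, dictation, image, etc."""
    record_id: int
    kind: str                         # e.g. "login", "note", "card", "audio", "image"
    title: str
    text: Optional[str] = None        # text-based payload (password, note body, ...)
    media_path: Optional[str] = None  # path to audio/image/video payload, if any
    reminder_at: Optional[datetime] = None
    created_at: datetime = field(default_factory=datetime.now)


class DataStore:
    """In-memory capture/list/edit/delete operations on user records."""

    def __init__(self) -> None:
        self._records: dict[int, UserRecord] = {}
        self._next_id = 1

    def capture(self, kind: str, title: str, **payload) -> UserRecord:
        record = UserRecord(self._next_id, kind, title, **payload)
        self._records[record.record_id] = record
        self._next_id += 1
        return record

    def list_records(self, kind: Optional[str] = None) -> list:
        return [r for r in self._records.values() if kind is None or r.kind == kind]

    def edit(self, record_id: int, **changes) -> None:
        record = self._records[record_id]
        for name, value in changes.items():
            setattr(record, name, value)

    def delete(self, record_id: int) -> None:
        del self._records[record_id]


if __name__ == "__main__":
    store = DataStore()
    store.capture("login", "Email account", text="user@example.com / s3cret")
    store.capture("note", "Locker code", text="24-16-08")
    print([r.title for r in store.list_records("login")])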

In some aspects of the embodiments, some embodiments comprise one or more circuitries configured to facilitate displaying the GUI through one or more display units. In some embodiments, the GUI may be displayed by using one or more internal (permanent, part-of-device) display unit(s). In some embodiments, the GUI may be displayed by using one or more internal display units and/or by connecting with one or more external display unit(s).

In some aspects of the embodiments, some embodiments comprise one or more circuitries configured to facilitate taking user inputs through one or more physical contact-based input units, where a physical contact input unit may facilitate taking input through physical contact with the user, where user input may be given such as by pressing, tapping, clicking, swiping, sliding, rolling, rotating etc. In different embodiments, one or more user input(s) may be given by using one or more internal physical input unit(s) and/or by connecting with one or more external physical input unit(s). The inputs given by using one or more physical input unit(s) may correspond to input for user data and/or other features and/or functions facilitated by the devices. In some embodiments, some and/or all user input(s) given through one or more physical input unit(s) may be input for interacting with the GUI, such that one or more input(s) given by using said input unit(s) for interaction with the GUI may or may not be input for providing text, and if one or more inputs are given for text then they may or may not be input(s) for text-based user data.

In some embodiments, some and/or all user input(s) given through one or more physical input unit(s) may correspond to interaction with the GUI, such that one or more input(s) given by using said input unit(s) for interaction with the GUI may or may not be for providing text, and if the one or more inputs are not given for text then the input may be for navigating through and/or accessing and/or performing one or more facilitated features and/or functions. In some embodiments, some and/or all user inputs given through one or more physical input unit(s) may not correspond to interaction with the GUI. In some embodiments, some and/or all user inputs given through one or more physical input unit(s) may correspond to other features and functions facilitated by the device, which may include input for starting one or more internal processes and/or activities, and/or input for interacting with one or more external devices.
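For illustration only, a minimal Python sketch of how physical-contact inputs might be routed either to the GUI or directly to a facilitated feature; the event names and handler table are assumptions, not part of the disclosure.

# Illustrative routing of physical-contact inputs: some inputs drive the GUI
# (navigation, text entry), others directly trigger a facilitated feature.
from typing import Callable

FEATURE_HANDLERS: dict[str, Callable[[], None]] = {
    "long_press_side_button": lambda: print("launch audio recording"),
    "double_press_side_button": lambda: print("toggle display light"),
}

GUI_EVENTS = {"tap", "swipe_left", "swipe_right", "rotate_dial", "key_char"}


def route_input(event: str) -> None:
    """Send the event to the GUI layer or to a feature handler."""
    if event in GUI_EVENTS:
        print(f"GUI handles: {event}")          # navigation / text entry path
    elif event in FEATURE_HANDLERS:
        FEATURE_HANDLERS[event]()               # feature / shortcut path
    else:
        print(f"ignored: {event}")


if __name__ == "__main__":
    for e in ("tap", "long_press_side_button", "unknown"):
        route_input(e)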

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to facilitate sound-based output, given through one or more sound output mechanisms. In different embodiments, sound-based output may be given by using one or more internal sound output device(s) and/or by connecting with external sound output device(s), to facilitate one or more features and/or functions including but not limited to those disclosed in this document. In different embodiments, the given sound output may correspond to sound output of user data and/or sound output for one or more other features and/or functions facilitated by the device. In some embodiments, some and/or all sound output may correspond to user data (which may be audio-based user data and/or video-based user data and/or text-based user data, such that the text-based user data may be converted into speech, for example by text-to-speech synthesis analysis and/or modeling etc.).

In some embodiments, some and/or all sound output may correspond to one or more of the other features and functions facilitated by the device, which may include system-generated sound output and user-generated sound output, such as when user input is given for any activity or process which may generate sound-based output.

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to facilitate connecting with one or more external device(s) for facilitating one or more features and/or functions, including but not limited to the features and/or functions disclosed in this document. In different embodiments, some and/or all interaction with external devices may be with one or more storage devices, computing devices, input and/or output devices. In different embodiments, some and/or all interaction with external devices may correspond to user data and/or to one or more features facilitated by device 100. In some embodiments, interaction corresponding to user data with external devices may facilitate transferring (cut/copy and paste) user data to and/or from external devices. In some embodiments, interaction corresponding to user data with one or more external devices may facilitate either listing, editing, saving, deleting, cutting, copying, pasting, accessing, merging or any suitable combination thereof of compatible data, and/or may facilitate either accessing, editing (name etc.), saving, deleting, cutting, copying, pasting or any suitable combination thereof of incompatible data transferred from external devices. In some embodiments, interaction corresponding to user data with one or more external devices may facilitate the user in printing printable user data. In some embodiments, interaction corresponding to user data with one or more external devices may facilitate the user in capturing user data by using external devices. In some embodiments, interaction corresponding to user data with one or more external devices may facilitate accessing stored user data and/or modifying stored user data and/or adding user data. In some embodiments, interactions corresponding to user data with one or more external devices may facilitate transferring user data to external devices. In some embodiments, interactions corresponding to user data with one or more external devices may facilitate printing user data. In some embodiments, interaction with one or more external devices may facilitate updating device computing codes (computer programs comprising kernel, middleware etc.). In some embodiments, some and/or all interaction with external devices corresponds to one or more of the other features and/or functions facilitated by the device.
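For illustration only, a minimal Python sketch of exporting text-based user records to a computer-readable file on an external storage device; CSV is used purely as an example format, and the mount path and record fields are assumptions, not part of the disclosure.

# Illustrative export of text-based user records to a computer-readable file
# on an external storage device.
import csv
from pathlib import Path


def export_records(records, destination: Path) -> Path:
    """Write (kind, title, text) rows of text-based records to a CSV file."""
    destination.parent.mkdir(parents=True, exist_ok=True)
    with destination.open("w", newline="", encoding="utf-8") as handle:
        writer = csv.writer(handle)
        writer.writerow(["kind", "title", "text"])
        for record in records:
            if record.get("text") is not None:      # only text-based items
                writer.writerow([record["kind"], record["title"], record["text"]])
    return destination


if __name__ == "__main__":
    sample = [
        {"kind": "login", "title": "Email", "text": "user@example.com / s3cret"},
        {"kind": "note", "title": "Locker code", "text": "24-16-08"},
    ]
    print(export_records(sample, Path("/tmp/external_drive/export.csv")))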

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to facilitate sound-based input, which may be captured by using one or more sound input mechanisms for facilitating one or more features and/or functions, including but not limited to the features and/or functions disclosed in this document. In different embodiments, sound input may be captured by using one or more internal sound input devices and/or by connecting with external sound input devices. In different embodiments, some and/or all captured sound input may correspond to one or more types of user data (which may be, for example, audio-based user data and/or video-based user data and/or text-based user data, such that spoken content may be converted into text, for example by using one or more forms of speech recognition analysis and/or modeling etc.), and/or to other features and/or functions facilitated by the device. The embodiments for which captured sound input corresponds to user data may further facilitate either listing, accessing, editing, saving, deleting, moving (cut/copy and paste) or any suitable combination thereof of the captured user data. In some embodiments, some and/or all sound-based input may correspond to one or more of the other features and/or functions facilitated by the device. In some embodiments, some and/or all sound-based input may correspond to interaction with the GUI, which may include one or more processes and/or activities for performing one or more features and/or functions.
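For illustration only, a minimal Python sketch of turning captured dictation into a stored text note; the transcribe function is a placeholder for whatever speech-recognition component a device might employ and is not an API from the disclosure.

# Illustrative flow for turning captured sound input into a stored text note.
def transcribe(audio_bytes: bytes) -> str:
    """Stand-in recognizer: a real device would call its speech-to-text engine here."""
    return "call the bank on Monday"          # dummy result for illustration


def capture_dictation(audio_bytes: bytes, store: dict) -> str:
    """Convert a dictation into text and keep both the audio and the text."""
    text = transcribe(audio_bytes)
    note_id = len(store) + 1
    store[note_id] = {"audio": audio_bytes, "text": text}
    return text


if __name__ == "__main__":
    notes: dict = {}
    print(capture_dictation(b"\x00\x01", notes))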

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to facilitate visual-based inputs, which may be captured by using one or more visual capturing mechanisms, which may comprise one or more image sensors. In different embodiments, visual-based input may be captured by one or more internal visual input devices and/or external visual input devices to facilitate one or more features and/or functions, including but not limited to those disclosed in this document. In different embodiments, some and/or all visual input may correspond to user data and/or to other features and/or functions facilitated by the device. The embodiments for which visual input corresponds to user data may further facilitate either listing, accessing, editing, saving, deleting, cutting, copying, pasting, or any suitable combination thereof of the captured user data. In some embodiments, some and/or all visual input may correspond to visual input for one or more of the other features and/or functions facilitated by the device. In some embodiments, some and/or all visual-based input may correspond to interaction with the GUI.

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to facilitate controlling one or more light source(s) to facilitate one or more features and/or functions, including but not limited to those disclosed in this document. In different embodiments, some and/or all light sources may be controlled for assisting in capturing user data and/or for other features and/or functions facilitated by the device. The embodiments for which a light source is used for user data may or may not facilitate activating/deactivating the light source for assisting with its functions. In some embodiments, some and/or all controlling of the light source corresponds to one or more of the other features and/or functions facilitated by the device.

In some aspects of the embodiments, some embodiments comprise one or more circuitries configured to facilitate taking inputs given by one or more additional sensors, to facilitate one or more features and/or functions including but not limited to those disclosed in this document. In different embodiments, some and/or all inputs given by one or more of the additional sensors may be input corresponding to user data and/or input for other features and/or functions facilitated by the device. In some embodiments, some and/or all inputs sensed by one or more sensors may correspond to user input for user data. The embodiments for which input corresponds to user data may further facilitate either listing, accessing, editing, saving, deleting, moving or any suitable combination thereof of the captured user data. In some embodiments, some and/or all inputs sensed by one or more sensor(s) may correspond to user input for interacting with the GUI. In some embodiments, some and/or all inputs given by one or more additional sensors may correspond to input based on the environmental conditions of the sensor. In some embodiments, some and/or all input given by one or more additional sensors may correspond to input for one or more of the other features and/or functions facilitated by the device.

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to facilitate transmitting input and/or controlling one or more biometric sensors for capturing biometric prints for authentication and/or authorization by authentication, for accessing one or more features and/or functions facilitated by the device. In different embodiments, the biometric authentication method may be implemented by using one or more biometric sensors, where the input may be given through either physical touch-based input, sound-based input, visual-based input or any suitable combination thereof. In some embodiments, some and/or all authentication and/or authorization-by-authentication methods for accessing one or more features and/or functions may comprise a biometric authentication method. In some embodiments, some and/or all authentication and/or authorization-by-authentication methods for accessing one or more features and/or functions may not comprise a biometric authentication method.
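For illustration only, a minimal Python sketch of gating a feature behind an authentication method, as described above; the fingerprint_match placeholder stands in for a real biometric check and is not part of the disclosure.

# Illustrative gate for features protected by an authentication method.
from typing import Callable


def require_authentication(authenticator: Callable[[], bool]):
    """Decorator: run the feature only if the authenticator approves the user."""
    def wrap(feature: Callable[[], None]) -> Callable[[], None]:
        def guarded() -> None:
            if authenticator():
                feature()
            else:
                print("authentication failed: feature not launched")
        return guarded
    return wrap


def fingerprint_match() -> bool:
    # Placeholder: a real implementation would compare a captured print
    # against the enrolled biometric template.
    return True


@require_authentication(fingerprint_match)
def show_stored_passwords() -> None:
    print("listing password records...")


if __name__ == "__main__":
    show_stored_passwords()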

In some aspects of the embodiments, some embodiments may comprise one or more circuitries configured to transmit signals for storing and/or retrieving cryptographic keys stored in one or more storage devices to facilitate hardware-based encryption, including but not limited to that disclosed in this document. In different embodiments, hardware-based encryption may be implemented by using one or more internal hardware components and/or by connecting with one or more external devices. In different embodiments, decryption may not require authentication of the user, and/or decryption may take place after the user is authenticated by one or more authentication methods.
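For illustration only, a software sketch (Python) of encrypting and decrypting stored user data with a symmetric key, as an analogue of the hardware-based encryption described above; it assumes the third-party cryptography package, and the key file stands in for key material held in secure hardware.

# Software analogue of hardware-backed encryption of stored user data.
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("device_key.bin")   # stand-in for key material held in a secure element


def load_or_create_key() -> bytes:
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)
    return key


def encrypt_record(plaintext: str) -> bytes:
    return Fernet(load_or_create_key()).encrypt(plaintext.encode("utf-8"))


def decrypt_record(token: bytes) -> str:
    return Fernet(load_or_create_key()).decrypt(token).decode("utf-8")


if __name__ == "__main__":
    token = encrypt_record("card 4111 1111 1111 1111, exp 12/27")
    print(decrypt_record(token))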

DESCRIPTION

Several aspects of the invention are described below with reference to examples for illustration. Appearances of phrases like "in some embodiments", "in other embodiments", "in different embodiments" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiments of devices 100. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments. It will be further understood that terms like "comprises", "comprising" or alike, when used in this specification, specify the presence of stated features, integers, steps, operations, modules, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, modules, components, and/or groups thereof. As used in the summary, description of the invention and appended claims, the singular forms "a", "an" and "the" are intended to include plural forms as well, unless the context clearly indicates otherwise at the relevant section. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. Further, it should also be understood that "(s)" denotes the possibility of plurality with respect to the existence of that particular element and/or component and/or module and/or method etc.

The embodiments described herein are demonstrated in the form of non-limiting examples. The examples used herein are intended merely to facilitate an understanding of the ways in which different forms of the embodiments herein may be practiced and not to define limitations. Further, aspects of the disclosure or terminology used to mention physical components or elements etc. of devices 100 are in the form of modules, so as to give the reader a full understanding of the functional nature of that particular categorical and/or other further version of a component or element or module etc. Further, the embodiments disclosed herein comprise one or more processors which execute one or more computer programs comprising codes stored in one or more non-transitory memories which, when executed, facilitate the functions and features disclosed in this document. This should not be considered a limitation for the embodiments, as other features may be employed by implementing their computing codes and/or categorical modules to give the embodiments additional sets of features and functions. For example, additional computing codes may be implemented to facilitate additional functions or features, like including a digital calculator or giving a different user interface for highlighting a particular feature etc.; or a different categorical module, such as an additional sensor module, may be used to facilitate different features; or different input and output methods can be used, and so on. Accordingly, features and functions should not be construed as limiting the scope of the embodiments herein. Further, it should also be understood that references to computer programs, computing codes etc. may encompass one or more computer programs or software for a programmable processor or firmware such as, for example, the programmable content of a hardware device, whether instructions for a processor or a firmware or other programming codes, which can be updated with new versions through certain external devices like storage devices, computing devices etc. One skilled in the relevant arts will readily recognize that the inventions can be practiced without one or more of the specific details or components, or with other methods with respect to either hardware components and/or programming components etc., and thus this disclosure should not be considered as limiting the scope of the embodiments, as the explanation is given in a broad manner to demonstrate to those of ordinary skill the easiest possible methods of implementation. Further, in some embodiments, devices 100 may be water resistant or waterproof, and/or the surface of devices 100 may offer resistance to dust. Further, in some embodiments, devices 100 may comprise electromagnetic attack countermeasures which may be implemented in any suitable form, for example by using a computer-program-based method for generating random electromagnetic pulses and/or by using suitable material in the embodiments, implemented by any suitable means like lining etc.

A module is a self-contained assembly of computing codes and/or electronic components & circuitry and/or mechanical parts, which facilitates processing of assigned functions or tasks. Hence, modules can be hardware, software or a combination of both.

Circuitry comprises a system of circuits for performing a particular function in an electronic device. The circuitry can be designed for executing a particular function or for executing multiple functions by using single or separate dedicated processing units.

Thus circuitries of modules may comprise various elements and/or components to process and execute one or more functions and/or features.
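For illustration only, a minimal Python sketch of the module abstraction described above, where self-contained modules are registered with and driven by the process & control logic; all class names are hypothetical and not part of the disclosure.

# Illustrative module abstraction: self-contained units driven through a
# common interface by the process & control logic.
from abc import ABC, abstractmethod


class Module(ABC):
    """Self-contained assembly of code and/or circuitry handling assigned tasks."""

    @abstractmethod
    def handle(self, request: str) -> str:
        ...


class SoundOutputModule(Module):
    def handle(self, request: str) -> str:
        return f"playing: {request}"


class LightSourceModule(Module):
    def handle(self, request: str) -> str:
        return f"light {request}"


class ProcessAndControl:
    """Registers modules and routes requests to them."""

    def __init__(self) -> None:
        self.modules: dict[str, Module] = {}

    def register(self, name: str, module: Module) -> None:
        self.modules[name] = module

    def dispatch(self, name: str, request: str) -> str:
        return self.modules[name].handle(request)


if __name__ == "__main__":
    ctrl = ProcessAndControl()
    ctrl.register("sound", SoundOutputModule())
    ctrl.register("light", LightSourceModule())
    print(ctrl.dispatch("sound", "reminder tone"))
    print(ctrl.dispatch("light", "on"))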

Following are basic components which devices 100 may comprise of:

The devices 100 comprise a process and control module 101, a power module 102, and a user interface module 103, wherein the process and control module 101 is configured to control and communicate with all other employed modules and further configured to execute one or more computer programs comprising codes to facilitate features and functions and to execute corresponding user inputs; the power module 102 is configured to supply power to the various components of devices 100; and the user interface module 103 is configured to facilitate the GUI and take corresponding user inputs.

Process & control module 101 facilitates logically connecting, managing and controlling all hardware and employed modules. This unit comprises the governing logic which defines the functioning of devices 100. Process & control module 101 comprises a processing module 1011 and a memory module 1012. The processing module 1011 may comprise a single general-purpose processing unit or multiple processing units, internally and/or separately, with each processing unit potentially being designed for a specific task, and is further configured to run and execute programs and/or sets of instructions for facilitating functions and features and to execute corresponding user inputs. Memory module 1012 may comprise volatile memory and/or non-volatile memory 1013, such as one or more disk storage devices, flash memory devices, or other non-volatile solid state memory devices.

Power module 102 is a module which facilitates providing power to devices 100. Power module 102 comprises a power supply module 1021 comprising power supply circuitry 1022. In some embodiments, power module 102 may use a power supply medium 1023 which can hold an electric charge, like a rechargeable battery and/or solar cell etc. Further, power module 102 may comprise a charging module 1024 and a means 1025 to connect with external power sources, like a port etc., which may be coupled to the charging module. Charging module 1024 may be any module which at least facilitates charging the power supply medium 1023. Power module 102 may further facilitate providing power through said means 1025 in case of battery failure. The power module may further comprise circuitry for wireless charging. The wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, an electromagnetic method, or the like. An additional circuit for wireless charging, such as a coil loop, a resonant circuit, a rectifier, or the like, may further be included. A battery gauge may measure, for example, the remaining capacity of the battery and its voltage, current or temperature while the battery is charged. In some embodiments, the power module may further facilitate providing power to external devices, either through said means 1025 and/or by a wireless function. In some embodiments, device 100 may use alternative power sources, for example a solar cell etc. In some embodiments, power module 102 may comprise power supply circuitry 1022 and means to connect, like a port, slot etc., to external power supply sources, like a battery bank, AC power through an adapter, or other power sources like solar cells etc. In some embodiments, power module 102 may comprise power supply circuitry 1022 and means to connect with external power supply sources, and may further comprise provision for using a power supply medium 1023, like a battery, cell module etc., which may not be charged by internal circuitry. In some embodiments, power module 102 may comprise power supply circuitry 1022 and a power supply medium 1023, for instance a battery, cell module etc., to provide power to the devices 100. Further, in embodiments, the power module may comprise a power status indicator, like a light source module, for instance an LED etc. The indicator may display a specific state of the electronic device or a part thereof (e.g., the processor), such as a booting state, a charging state, or the like. Further, the power module may comprise power-related safety measures with respect to the safety of devices 100, like overcharging protection, a power failure detection circuit etc., and other suitable components for the generation, management and distribution of power. In other embodiments, devices 100 may comprise other power generation, distribution and management components and/or methods, now known or later developed.
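For illustration only, a minimal Python sketch mapping battery-gauge readings to an indicator state in the spirit of the power module described above; the thresholds and state names are assumptions, not values from the disclosure.

# Illustrative mapping from battery-gauge readings to an indicator state.
def indicator_state(capacity_percent: float, is_charging: bool) -> str:
    if is_charging:
        return "charging"
    if capacity_percent <= 10:
        return "low_battery_alert"
    return "normal"


if __name__ == "__main__":
    print(indicator_state(8, False))    # low_battery_alert
    print(indicator_state(55, True))    # charging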

User interface module 103 is a module which facilitates the user in interacting with devices 100 via the GUI. User interface module 103 comprises display unit(s) 1031 and input unit(s) 1032. In the embodiments, the display units are configured to display the GUI and the input units are configured to take user input for interacting with the GUI. In some embodiments, display unit 1031 and input unit 1032 may be a single unit. In some embodiments, display unit 1031 and input unit 1032 may be separate units. In some embodiments, display unit 1031 and input unit 1032 may be a single unit and/or separate units and may further have additional display unit(s) and/or additional input unit(s). For example, user interface module 103 may comprise display unit 1031 and input unit 1032 as a single unit or separate units, along with an additional display unit 1033 and input unit 1034 as a single unit or separate units, which may further have one or more additional display unit(s) 1035 and/or input unit(s) 1036 as single and/or separate units. The devices 100 may use any suitable display method and/or unit for displaying the GUI and any suitable input method and/or unit for taking user input. In some embodiments, device 100 may comprise means to connect with external display and input devices, or may use only internal or only external units, or any suitable combination of one or more display units and one or more input units, to facilitate one or more of its functions and/or features.

EXAMPLE OF DEVICE

The devices 100 may comprise of various modules (components) to facilitate with several functions and features. A module may facilitate devices 100 to provide a particular function and/or a feature alone and/or in conjunction with other modules.

Following are a few examples of modules which devices 100 may comprise, in any suitable combination, to facilitate one or more functions and/or features:

Additional input interface module 104 is a module which provides an additional input interface 1041 for giving user inputs by means of physical contact. Additional input interface module 104 comprises an additional input interface 1041 and an additional input interface medium 1042, which may facilitate the user in providing input for one or more functions and/or features. Additional input interface 1041 and additional input interface medium 1042 may be implemented in electronic, mechanical or electro-mechanical form, for example slider switches, push buttons, dials, rocker buttons, electronic haptic feedback mechanisms, or any other sensor interfaced with a mechanical mechanism like a push mechanism, rotating mechanism, or sliding mechanism such as slider switches etc. The devices 100 may comprise one or more additional input interface modules 104, each comprising one or more additional input interfaces 1041. The additional input interface module 104 may facilitate giving user input to device 100 for performing one or more functions and/or features. Further, the additional input interface module 104 may be function and/or feature specific and/or may facilitate performing multiple functions and/or features, such that one or more additional input interface modules may facilitate performing one or more features and/or functions, alone and/or in combination with another additional input interface module 104. For example, if any one or more push-button mechanisms or the like (variations of implementation) facilitate providing user input for powering on/off or resetting devices 100, alone and/or in conjunction with other modules, then some embodiments may also facilitate using that mechanism, alone and/or in conjunction with one or more other additional input interface modules 104, for providing user input to perform one or more features and/or functions. In another example of its application, devices 100 may facilitate using additional input interface module 104 to perform one or more tasks in the GUI, like locking the GUI, adjusting display brightness, or scrolling/navigating through the GUI, for instance for selecting and/or activating one or more functions and/or features or for navigating through, selecting and viewing one or more stored data types etc.

Further, devices 100 may facilitate using additional input interface module 104 in conjunction with other categorical modules to facilitate different functions and/or features. For example, user input given through one or more additional input interfaces 1041 may facilitate performing one or more activities or processes, like launching or activating a particular feature and/or function, and/or act as an input for performing an activity with respect to any other module, such that additional input interface module 104 may work as a type of short-key (shortcut). For instance, in some embodiments, user input may be given to launch and/or activate and/or deactivate one or more functions and/or features, like those of camera module 109, sound input module 106, light source module 111 etc., such that giving one or more inputs, which may or may not be combinational in nature, alone and/or with one or more other additional input interface modules 104, may launch and/or activate/deactivate a function, for instance of the microphone, say for audio recording, which may either display the GUI or may perform the activity in the background as per the selected option from the settings of the feature, although in some embodiments one or more of such features and/or functions may or may not be selectable and/or may be system-enforced. In another example of its application, additional input interface module 104 may be used to give user input for performing functions with respect to the display module, like turning on/off the display lights. The settings and policies of features facilitated by devices 100 by using additional input interface modules 104 may be accessed via the GUI.
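For illustration only, a minimal Python sketch of shortcut handling for the additional input interface, where single presses or combinations launch features; the combinations and feature names are hypothetical, not taken from the disclosure.

# Illustrative shortcut handling: button presses or combinations mapped to features.
SHORTCUTS: dict[tuple, str] = {
    ("side_button_long",): "start_audio_recording",
    ("side_button", "dial_press"): "launch_camera",
    ("dial_press_double",): "toggle_torch",
}


def handle_shortcut(pressed: tuple) -> str:
    feature = SHORTCUTS.get(pressed)
    return f"launch {feature}" if feature else "no shortcut bound"


if __name__ == "__main__":
    print(handle_shortcut(("side_button", "dial_press")))
    print(handle_shortcut(("unknown",)))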

External device interface module 105 is a module which provides an interface 1051 to connect with external devices. The devices 100 may comprise at least one external device interface module 105, comprising at least one external device interface. The devices 100 may comprise any suitable external device interface module 105 to facilitate one or more functions and/or features. Examples of external device interface modules 105 are Universal Serial Bus (USB) modules like USB, micro USB etc., FireWire, HDMI, SD card modules, memory card modules, auxiliary modules etc. The external device interface module 105 may facilitate device 100 in connecting with one or more input devices and/or output devices and/or input-output devices like display devices, printing devices, storage devices, sound output devices, sound input devices, visual capturing devices like webcams etc., and external computing devices etc. Further, devices 100 may comprise multiple external device interface modules 105 which may facilitate one or more functions and/or features of a similar nature. The external device interface modules 105 may be function specific or feature specific, or may facilitate performing one or more functions and/or features. The external device interface module 105 may facilitate the devices 100 in connecting with external devices for capturing audio-based input and/or visual-based input. An external device interface module 105 which facilitates the device 100 in accessing external storage devices may further extend device functionality to store data on external devices and to be used as a storage medium. Further, devices 100 may facilitate using the external device interface module for copying text-based data to external storage devices in other computer-program-readable formats such as xls, pdf etc., and may further facilitate encrypting and password-protecting external storage devices. Further, devices 100 may use external device interface module 105 to connect with external output devices like display devices, sound output devices, printing devices etc. The devices 100 may further facilitate using external device interface module 105 to connect with external input devices like a keyboard, mouse, sound input devices etc. The devices 100 may further facilitate connecting with compatible external printing devices to print user data like text, images etc. Further, devices 100 may facilitate temporarily disconnecting/restricting and/or disabling communication with certain external devices. The devices 100 may further facilitate changing or upgrading the computer programs of devices 100 by connecting to certain external devices like storage devices, computing devices etc. through external device interface module 105. The devices 100 may facilitate using external device interface module 105 in conjunction with other modules to facilitate functions and features. For example, a USB module may be coupled to, say, the charging module, and can further be used to communicate with external devices. The USB module thus works in conjunction with the charging module to facilitate multiple functions, i.e. both charging and accessing external devices. The USB interface can be used to connect to input devices like a keyboard, mouse etc., a webcam-mic set etc., to capture data etc. Further, the USB interface can be used to connect with output devices like an external display unit, speaker etc.
Further, the USB interface can also be used to connect with devices like a computing device, a laptop, smart display and/or input devices, printers etc. The USB interface can also be used for transferring data to and/or from external storage devices, like video, pictures, audio, text data etc. Another example of an external device interface module 105 is a memory card module, which can also be used to transfer data. Thus multiple external device interface modules 105, the USB module and the memory card module, facilitate transferring the data. Another example is an audio jack which facilitates connecting with external sound input and sound output devices, for instance a headphone with mic or external speakers etc. Another example is one in which additional input interface module 104 may be used to provide user inputs for adjusting the sound of a connected speaker or capturing an image or a video through an external webcam etc., thereby working in conjunction with the external device interface module. The settings and policies of the features facilitated by devices 100 by using external device interface module 105 may be accessed via the GUI. Another example is of using

Sound input module 106 is a module which facilitates capturing sound. It is more like the sound input mechanism of device 100, which facilitates capturing sound-based input. The devices 100 may comprise at least one sound input module 106 comprising at least one sound input medium 1051 to facilitate one or more features and/or functions. The sound input mechanism can be implemented by any suitable method/means which may facilitate capturing sound-based input, such as by using an internal mechanism, for example comprising a sound input medium 1051 like one or more acoustic sensors such as a microphone etc., and/or by connecting sound input devices through any means, for example through an audio jack which may facilitate capturing sound-based input by connecting with an external sound input mechanism like a headset etc. The sound input module 106 may facilitate device 100 in capturing sound-based user input. For example, the sound-based user input provided for capturing data may be recognized as a command by the process and control module through a list of voice commands for capturing the user data, such that the spoken content may be converted into text-based data by using speech recognition analysis and/or modeling etc. Another example of an application of sound input module 106 is that it may facilitate capturing audio data, for instance audio data like notes, voice memos and so on. In another example of its application, devices 100 may use sound input module 106 for facilitating capturing sound-based input for authentication and/or authorization protocols, by authenticating the user such as by phrase recognition. In some embodiments, devices 100 may use the sound input mechanism in conjunction with biometric sensor module 110 for capturing biometric prints, for example a voice print for voice recognition to authorize/authenticate by voiceprint, text dependent and/or text independent etc. In some embodiments, biometric sensor module 110 may comprise one or more separate acoustic sensors for a sound-authentication biometric system.

In another example of an application, the sound-based user input may be the user input for interacting with a voice assistant. In another example of its application, devices 100 may use sound input module 106 in conjunction with other employed modules to facilitate functions and/or features; for instance, sound input module 106 may be used in conjunction with camera module 109 to capture the sound of videos and/or to launch or activate functions of camera module 109 etc. In another example, devices 100 may facilitate using additional input interface module 104 to launch and/or activate a function of sound input module 106 for taking sound-based user input, wherein the sound-based user input may correspond to any feature and/or user data. The settings and policies of features facilitated by devices 100 by using the sound input module may be accessed via the GUI.
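For illustration only, a minimal Python sketch of matching recognized speech against a list of voice commands, as mentioned above; the command list and bound actions are examples only, not part of the disclosure.

# Illustrative matching of captured speech against a list of voice commands.
VOICE_COMMANDS = {
    "new note": "open_note_capture",
    "record audio": "start_audio_recording",
    "read my notes": "speak_stored_notes",
}


def interpret(spoken_text: str) -> str:
    """Return the action bound to the recognized phrase, if any."""
    phrase = spoken_text.strip().lower()
    return VOICE_COMMANDS.get(phrase, "not_a_command")


if __name__ == "__main__":
    print(interpret("Record Audio"))   # start_audio_recording
    print(interpret("hello"))          # not_a_command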

Sound output module 107 is a module which facilitates providing sound-based output. Sound output module 107 is more like the sound output mechanism of devices 100, which facilitates providing sound-based output. The devices 100 may comprise at least one sound output module 107 comprising at least one sound output medium 1071 to facilitate one or more features and/or functions. The same may be implemented by any suitable method/means which may facilitate giving sound-based output, such as by using an internal mechanism, for example comprising a sound output medium 1071 such as a speaker module etc., and/or by connecting with an external sound output device, for example through an audio jack which may facilitate giving sound-based output by connecting with an external sound output mechanism like a headset etc. The sound output module 107 may facilitate devices 100 in giving sound-based output. For example, devices 100 may use sound output module 107 to give sound output of stored data, for instance a content reader reading text displayed in the GUI comprising text data, using text-to-speech synthesis. Another example is where sound output is given for stored sound-based data, like audio-based and/or video-based data, such as when user input is given for generation of sound-based output. In another application, sound output may be given as system-generated sound output, such as for operations performed in the device, for instance in the GUI, like the sound of the screen lock, alerts, notifications, a low battery signal, synthesized voice-based feedback, or when a user input is received etc. Further, devices 100 may use sound output module 107 in conjunction with other modules; for example, devices 100 may use additional input interface module 104 to control the sound output level of sound output medium 1071. Further, devices 100 may facilitate controlling the sound output level via the GUI. The settings and policies of features facilitated by devices 100 by using sound output module 107 may be accessed via the graphical user interface.
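For illustration only, a minimal Python sketch of a content reader speaking a stored text record aloud; it assumes the third-party pyttsx3 package as one possible text-to-speech backend, and the record format is hypothetical.

# Illustrative content reader: speaking a stored text record aloud.
import pyttsx3


def speak_record(record: dict) -> None:
    """Read the title and text body of a stored record aloud."""
    engine = pyttsx3.init()
    engine.say(f"{record['title']}. {record['text']}")
    engine.runAndWait()


if __name__ == "__main__":
    speak_record({"title": "Locker code", "text": "twenty four, sixteen, eight"})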

Additional sensor module 108 is a module comprising at least one sensor which gives relative input about the conditions for which it is employed. The input may be, for example, with respect to measuring a physical quantity and/or detection of an operational state and/or environmental state etc., with respect to conditions which may be converted into an input signal. The input received from the additional sensor module may facilitate performing one or more functions and/or features. The devices 100 may comprise at least one additional sensor module 108, comprising at least one sensor. For a particular function or feature, devices 100 may use a sensor alone or in conjunction with one or more other sensors. The additional sensor module may include, for example, a proximity sensor, illumination sensor, ultraviolet sensor, motion sensor (for instance a gyro sensor), barometric pressure sensor, magnetic sensor, acceleration sensor, gesture sensor, grip sensor or any other type of sensor to facilitate one or more features or functions. Further, additional sensor modules 108 may facilitate devices 100 in taking user input, for example gesture-based inputs like tilting gesture(s), sliding gesture(s), shaking gesture(s), tapping gesture(s) etc. Further, devices 100 may use additional sensor module(s) in conjunction with other modules to facilitate one or more functions and/or features.

For example, additional sensors like motion sensors, proximity sensors, illumination sensors etc. may be incorporated in devices 100 to add functionalities and features, such as measuring conditions for automatically adjusting screen brightness or turning on the display lights, or taking user input such as changing the orientation of the GUI with respect to device orientation, turning on the display lights, launching the camera function, magnifying the displayed GUI, launching the torch function of light source module 111, or launching one or more features and/or functions of the device like the audio player etc. The settings of features facilitated by devices 100 by using additional sensor modules 108 may be accessed via the graphical user interface.
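For illustration only, a minimal Python sketch of using an illumination-sensor reading to adjust display brightness automatically, one of the sensor-driven features listed above; the lux thresholds and brightness levels are assumptions.

# Illustrative mapping of an ambient-light reading (lux) to display brightness.
def brightness_for(ambient_lux: float) -> int:
    """Map an ambient-light reading to a display brightness percentage."""
    if ambient_lux < 10:
        return 20       # dark room: dim the display
    if ambient_lux < 500:
        return 60       # indoor lighting
    return 100          # bright / outdoor


if __name__ == "__main__":
    for lux in (3, 120, 2000):
        print(lux, "->", brightness_for(lux), "%")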

Camera module 109 is a module which facilitates capturing visual-based input. It is more like the visual capturing mechanism of device 100, which facilitates capturing visual-based input. The devices 100 may comprise at least one camera module 109 comprising at least one camera to facilitate one or more features and/or functions. The visual capturing mechanism can be implemented by any suitable method/means which may facilitate capturing visual-based input, such as by using an internal mechanism, for example comprising one or more image sensors 1092, and/or by connecting external visual input devices through any means, for example through USB, which may facilitate capturing visual-based input by connecting with an external visual input mechanism, for instance a webcam etc. Further, camera module 109 may have a normal camera configuration, infrared camera configuration, 3D camera configuration or any other configuration which may facilitate capturing the required visual data. The camera module 109 may facilitate devices 100 in performing various functions and features. For example, devices 100 may use the camera module to capture visual-based input comprising visual-based user data like images and/or videos etc. Another example of an application of camera module 109 is in capturing visual input provided by the user with respect to device operations, for instance text or input for performing an activity etc. Another example of its application is one in which devices 100 may use the camera to capture visual-based input in different view modes like normal view, wide-angle view, depth-based view, zoom-based view and so on.

Further, devices 100 may facilitate with other features related to the camera view like auto-focus, image stabilization, visual resolutions, filters, other visual enhancement & editing tools etc. In another example of its application, devices 100 may use the camera module in conjunction with other employed modules to perform a specific function or feature, like the camera module in conjunction with the sound input module to capture the audio of visual data like a video. Another example is one in which devices 100 may use the additional input interface module in conjunction with the camera module to perform functions like capturing images or videos, zoom in and zoom out functions etc. In another application, the devices 100 may take visual-based input for biometric authentication and/or authorization, by authenticating the user such as by facial recognition etc. In another example, devices 100 may use additional device interface module 104 to launch camera functions. Another example of camera module application is one in which devices 100 may facilitate in using the camera to digitally magnify objects in the camera view, such as like a digital magnifier.

The settings and policies of features facilitated by devices 100 by using the camera module may be accessed via the GUI. Biometric sensor modules 110 are modules which facilitate in authenticating the user(s) through their biometric print. A biometric print may comprise of one or more patterns of one or more characteristics of a biometric component. Biometric sensor module 110 may comprise of application modules and hardware modules. The devices 100 may comprise of at least one biometric sensor module 110 comprising of at least one biometric input capturing component. The devices 100 may comprise of any biometric sensor module 110 to authenticate the user by their biometric prints. For example, sets of instructions executed by one or more processors may have instructions for determining and/or recognizing whether one or more inputs captured as biometric components are responsive to the biometric components of the user; if the biometric components are in accordance with the biometric components of the user, then the one or more captured inputs may be stored as a biometric print, or the user may get a notification to give more inputs for capturing the biometric print. To authenticate the user, if the inputs provided by the user are in accordance with the stored biometric prints, then the user may get authenticated. If the captured inputs are not responsive to those of the biometric print, the user may be notified by an alert (visual, vibrational, sound or any combination). For example, fingerprint-based authentication methods, in which at least one pattern of at least one characteristic of at least one finger or thumb, like ridges, valleys etc., is used to authenticate the user. Further, the fingerprint module hardware component may be a separate module integrated in devices 100 or may be a module embedded in the user interface, such that it may or may not be visible to the naked eye. Another example is content recognition methods, in which at least one pattern of at least one characteristic of hand-written or drawn content of at least one sample, like signatures etc., is saved as a template to authenticate the user. Another example is voice-based authentication methods, where a number of voice samples of the user are used for determining voice box characteristics for a voiceprint. The same may further be implemented along with phrase recognition. Another example is visual-based authentication methods, where a number of visual samples may be provided by the user for determining the characteristics of a visual biometric template, for instance facial recognition authentication methods, where at least one pattern of at least one facial characteristic may be used to authenticate the user, whether shape-based, where spatial variation of features is used, and/or texture-based, where spatial variations of skin texture and features may be used; and eye scanning authentication methods, where at least one pattern of one characteristic of at least one eye, like the retina, iris etc., is used to authenticate the user, for instance one or more iris features like coronas, crypts etc. may be captured and segmented for extracting the iris code, and/or extracting the pattern of blood vessels, like that in retina scanning, can be used in the eye scanning module, or a combination of two different technologies or any other suitable means. In some embodiments, to authenticate the user by their biometric prints, devices 100 may use either an existing employed hardware module(s) like the microphone, camera etc., and/or a separate biometric hardware module(s).
For example, in some embodiments, for the voice-based biometric authentication method, devices 100 may facilitate to use an existing sound input module which may be used for capturing audio data, or may facilitate with a separate biometric sensor module 110 to capture the biometric print. Similarly, one or more of the image sensors used for authentication can be used for facial recognition. In some embodiments, external hardware biometric module(s) may be used to authenticate the user. From the above it can be seen that the biometric module can be a physical-contact based module or a non-physical-contact based module. Similarly, it can be a sound input-based module or a visual input-based module or a physical contact-based module. The settings and policies of features facilitated by devices 100 with respect to biometric sensor modules can be accessed through the GUI.
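As an illustrative sketch only (not the claimed matching method), a stored biometric print and a fresh capture could each be reduced to a fixed-length feature vector by some upstream extractor, with a simple similarity threshold deciding whether the user is authenticated; the threshold and helper names below are assumptions:

    import math

    MATCH_THRESHOLD = 0.90  # assumed similarity required to accept the user

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    def authenticate(stored_print, captured_features, alert):
        """stored_print / captured_features: lists of floats; alert: callable for the
        visual/vibrational/sound notification described above."""
        if cosine_similarity(stored_print, captured_features) >= MATCH_THRESHOLD:
            return True
        alert("Biometric input not recognised, please try again.")
        return False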

Light source module 111 is a module which facilitates with an additional light source. The devices 100 may comprise of at least one light source module 111, comprising of at least one light source. The light source module 111 may facilitate devices 100 with various functions and features. For example, devices 100 may use a light source module 111 like a light emitting diode (LED) module to add a torch or flash light function. Further, light source module 111 may also facilitate devices 100 in indicating functioning, for example as a power status indicator of device 100.

The devices 100 may further facilitate in using light source module 111 in conjunction with other modules to facilitate with certain functions and features. For example, an LED module may be employed to work as a flash light during camera operations. Devices 100 may use multiple light sources to facilitate with a single function or feature. For example, an employed LED module may be used both as a torch and as a flash light for the camera. In another example, devices 100 may use additional input interface module 104 to activate light source functions like the torch.

The settings and policies of features provided by devices 100 with respect to light source module 111 can be accessed in the device and system settings through the GUI.

Hardware encryption module 112 is a module which facilitates with hardware-based encryption. The module 112 may comprise of a coprocessor to assist the device with cryptographic protocols. For example, the module 112 may include a storage space that is higher in security level than the memory 1012, and may provide secure data storage and a protected execution environment. The hardware encryption module may be implemented with an additional circuit and may include an additional processor. Devices 100 comprise of at least one hardware encryption module to facilitate with hardware-based encryption. Hardware encryption module 112 may store the security token in a separate memory, which may be a separate internal memory, for example an internal secure element or trusted platform module (TPM) etc. In some embodiments, the security token may also be stored in external memory devices, such as a USB device or memory card devices etc., such that decryption may only take place when said device is connected to devices 100.

Further, some embodiments comprise of a mechanism which may facilitate in giving input for basic functioning like powering on/off, resetting etc. The same may be implemented by any suitable method; for example, a mechanism like additional input interface module 104, such as a push mechanism or any other suitable mechanism etc., may be used, and/or it may be facilitated by any of the user interface modules 103, such as by using an infrared port or any other suitable method etc., or by any suitable means, such that the mechanism facilitates in performing one or more of the said processes and/or activities. Some embodiments may auto power on/off based on remaining charge and/or power supply conditions, and/or may have pre-determined instructions and/or may facilitate in taking user-defined instructions, which may or may not be event-based instructions, for performing one or more of the said processes and/or activities.

The devices 100 may use all of the above or any suitable combination of categorical or sub-categorical modules to facilitate with different features and functions. From the above it can be understood that the user interface for interaction with the devices 100 can be implemented by using different types of modules. Further, the components explained above may be interdependent on each other for facilitating with one or more functions and/or features.

Examples of Implementations:

The following is an example of the devices 100, which demonstrates one of the several embodiments offering scope for improvement over conventional user data storage devices.

The devices 100 execute computer programs comprising sets of instructions or codes, which facilitate in displaying the GUI and taking user input based on the functionality of device 100. The sets of instructions may further comprise of instructions for some features, which may be features related to security, convenience tools and device usability and operations. The exampled embodiments of device 100 disclosed herein facilitate in capturing, editing, listing, moving, transferring, accessing and displaying text-based data, visual-based data and audio-based data, and further facilitate in capturing snapshots of the GUI.

In the following examples, device 100 may comprise of memory 1, a memory controller 2, processing units 3, audio circuitry 4, speaker 5, microphone 6, audio jack 7, motor 8, camera 9 for taking images and videos, LED 10, power status indicator 11, rechargeable battery 14, biometric module 110 comprising of camera system 12 for facial recognition and fingerprint sensor 13, micro USB 15, additional sensor module 108 which may include, for example, at least one of a gesture sensor 15, a gyro sensor 16, a barometric pressure sensor 17, a magnetic sensor 18, an acceleration sensor 19, a proximity sensor 20, a color sensor 21 (e.g., a red/green/blue (RGB) sensor), a grip sensor 22, a haptic sensor 23, a memory card module 24 and other input and/or control devices. Further, for simplicity of understanding, a touch-sensitive display module (touch screen unit 25), which constitutes a single unit, along with push buttons 26 (additional input interface module 104), are exampled as the physical input units in these embodiments. These components may communicate over one or more buses or signal lines. Further, components may be disposed on a circuit board. In some embodiments, devices 100 may have a flexible circuit which may have some flexibility, for example on a flexible sheet or board etc. In some embodiments, the display module may be flexible, such as where the glass or panel has flexibility. Further, some embodiments may be embodied as wearable electronic devices which may be adaptable and/or configured to be worn, like electronic devices which can be worn on the wrist or arm or waist or neck or at some other place. Some embodiments may be embodied as non-wearable electronic devices which may have form factors, for instance, like those of portable non-wearable electronic devices, for instance a form factor which can be placed on any surface and/or hanged and/or mounted and/or pocketed etc.

It should be appreciated that device 100 is only an example of a single type of embodiment, and that devices 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of components. For example, devices 100 can have different physical user interface elements which may or may not include a touch-sensitive display, or may include one or more other physical user interface devices, such as a physical keyboard, sliding switches, dials, an infrared port, a click wheel, a pointing device like a stylus etc., a touch pad, separate and/or single external I/O ports for the display and/or input device, a joystick, multiple screen-based display units (both touch, or one touch and one non-touch), a mini projection unit etc., or may only have sound-based input through which the user may interact by voice command, along with one or more physical input mechanisms like a rotating mechanism, push button etc., depending upon the type of embodiment.

In exemplary embodiments, process and control unit 26 may comprise of processing units 10 comprising of one or more processors configured to execute computer programs and/or sets of instructions and to execute corresponding user inputs. The processing units may further include a graphics processing unit (GPU) and/or an image signal processor (ISP). The embedded secure element is installed on the chip for encryption.

Further, the process and control unit comprises of memory 1 (which may include one or more computer readable storage mediums). In exemplary embodiments, the default memory 1 for executable codes and user data comprises of internal memory, although in some embodiments it may be one or more internal and/or external memories. The process and control unit may use memory controller 2 to manage access to memory 1 (in some embodiments, access to memory for one or more processing units may be direct or may be implemented by other mechanisms). Access to memory 1 by other components, such as processing units and other input/output (I/O) devices, may be controlled by memory controller 2. The process and control unit may further comprise of circuitry, or say an interface like an I/O interface, through which I/O devices may be controlled, say by using one or more controllers. Further, the process and control unit comprises of a circuit for controlling other components, like the input/output (I/O) interfaces. The I/O interfaces couple input and output devices to the processing unit and memory. The I/O interfaces may use controllers to control input/output devices. The controllers may receive/send electrical signals from/to I/O devices, although in some embodiments the arrangement may be different, such that the controller used for controlling memory may be used to control one or more of the other I/O devices.

Access to memory by other components, such as processing units 10, I/O interfaces etc., is controlled by the memory controller. In some embodiments, the I/O interface, processing unit 10 and memory controller can be implemented on a single chip. In some embodiments, the I/O interface, processing unit 10 and memory controller can be implemented on separate chips. Touch screen display unit 9 (touch-sensitive display) comprises of a touch-sensitive surface and sensor(s) and/or set(s) of sensors. A display controller may be used to receive and/or send electrical signals from/to the touch screen. Touch screen unit 9 is configured to display the GUI and take user input based on haptic and tactile contact.

The touch screen 9 and display controller (along with any associated modules and/or sets of instructions in memory) may detect interaction or contact, like any movement or breaking etc., on the touch screen, and convert the detected contact into interaction with displayed user-interface objects (e.g., one or more soft keys, icons, symbols, images etc.). In an exemplary embodiment, a point of contact between the touch screen and the user corresponds to a finger of the user. In exemplary embodiments, the touch screen may use LPD (light emitting polymer display) technology with capacitive touch sensing technologies to display the GUI and detect contact (although in some embodiments, devices 100 may use other display and touch sensing technologies to display the GUI and/or to determine the point of contact, such as contact, movements, breaking etc., whether now known or later developed, including but not limited to capacitive, resistive, infrared and surface acoustic wave technologies, proximity sensor arrays, LCD (liquid crystal display) technology, a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display etc.). The display may present various content (e.g., text, an image, a video, an icon, a symbol, or the like) to the user. The display may include a touch screen, and may receive a touch, gesture, proximity or hovering input from an electronic pen or a part of the body of the user. Further, the power module comprises of a charging module and rechargeable batteries for supplying power to various components. The charging module is coupled to the micro-USB port and facilitates charging by both cable-based and wireless charging. In some embodiments, device 100 may also facilitate in supplying power to an external device. In other embodiments, there may be any other suitable method for power supply and management.

Further, an SD card may be provided through the memory card module. The storage options for some data types, like video etc., are selectable and can be accessed from the GUI. In some embodiments, user data and executable codes may be stored in the same memory. In some embodiments, user data and executable codes may be stored in separate memories. In some embodiments, either or both of the user data and executable codes may be stored in external memory. Further, the input/output interface controller controls the micro-USB for connecting with external input and output devices. In some embodiments, device 100 may further have a memory card to transfer data to and/or from external devices. Further, the exampled embodiment comprises of microphones and speakers, interfaced with the audio circuitry. The audio circuitry 4, speaker 5 and microphone 6 provide the audio interface between the user and device 100. The speaker emits human-audible sound waves corresponding to audio data transmitted from audio circuitry 4. The microphone converts sound waves and transmits the captured data in the form of an electrical signal to audio circuitry 4 for further processing. The audio circuitry 4 converts the signals to audio data and transmits the audio data to the I/O interface for processing. The audio circuitry 4 may comprise of one or more signal converters. The signal sent may be used to save audio data and/or for processing other features. The audio-based user data may be retrieved from and/or transmitted to memory by the I/O interface. Further, in some embodiments, audio circuitry 4 is interfaced with an audio jack 7 to facilitate in connecting with external devices like a headset etc. (some embodiments may or may not comprise of an audio jack 7, but may comprise of a universal serial bus (USB) or any other sound I/O interface which may facilitate in connecting with external sound I/O devices). The speakers are used to give sound output comprising of system-generated alerts, user-based alerts, notifications, activities performed in the GUI, user data etc., where user data comprises of audio-based, video-based and text-based user data.

In some embodiments, devices 100 may comprise of a content reader as well, which may, for instance, read contents displayed in the GUI, like the screen of the exemplary embodiment, or may read content based on provided user input, such as a UI object selected by giving user input. To read the displayed content, the memory may, for example, comprise of sets of instructions which, when executed, may read the content, say by one or more text analysis and/or modeling methods. The same is also used for speaking the content of text-based data. The microphones 6 are used to capture sound-based input comprising of inputs pertaining to user data and with respect to other features such as, for example, voice commands, speech-to-text synthesis and authentication protocols, viz. recognition-based user authentication, text dependent and independent, although in some embodiments device 100 may or may not have voice recognition based authentication, and/or may have voice-based text-dependent authentication, where a voiceprint of the user may not be generated for recognizing the authorized user. When the user requests activation of any such feature, the user may be presented with a prompt on the display for generating a voiceprint, where one or more sample sound-based inputs given by the user may be used for generating the voiceprint. The samples can be analyzed and/or modeled based on unique information about the user's vocal tract and the behavior of the user's speaking patterns. For example, statistical models of the characteristics of spectral features present in a user's pronunciation of various phonemes can be built to distinguish the voice characteristics of different users' voices. For example, Vector Quantization (VQ) codebook-based techniques can be employed to generate a voiceprint. Ergodic-HMM-based methods that analyze stochastic Markovian transitions between states to build learned models of voice characteristics such as voicing, silence, stop burst, nasal/liquid, frication etc. can be used to generate a voiceprint, for example. In some implementations, a two-pass speaker recognition approach can be used that first explicitly determines phonemes or phoneme classes from the audio data of a speech input and then performs speaker verification by a weighted combination of matches for each recognized phoneme category. For voice-based authentication, speaker recognition analysis can be implemented. For example, the voice characteristics of the voice in the speech input can be compared to the voice characteristics of a voiceprint of an authorized user. If the voice is matched to the voiceprint, the user can be authenticated as an authorized user. If the voice cannot be matched to the voiceprint, the user will not be authenticated as an authorized user and an error message is prompted (e.g., audibly and/or visually and/or by vibration) to the user. The speech-to-text feature can be implemented by speech recognition analysis and/or modeling; for example, Hidden Markov modeling (HMM), dynamic time warping (DTW) etc. can be performed on the speech input to generate text that represents the content of the speech input. The voice command feature can be implemented by analyzing the text generated from the speech input to determine a command to invoke a feature of device 100.
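As an illustrative sketch (not the claimed VQ or HMM method) of how a stored voiceprint might be compared against a fresh speech sample, assuming some upstream extractor already produces fixed-length spectral feature vectors; the threshold and helper names are assumptions:

    import math

    VERIFY_THRESHOLD = 0.85  # assumed acceptance threshold

    def euclidean_distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def similarity(a, b):
        # Map distance to a 0..1 score; larger means more alike.
        return 1.0 / (1.0 + euclidean_distance(a, b))

    def verify_speaker(stored_voiceprint, speech_features):
        """Return True if the captured speech is close enough to the enrolled voiceprint."""
        return similarity(stored_voiceprint, speech_features) >= VERIFY_THRESHOLD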

For example, the speech input can be translated into text using speech-to-text processing and the text can be analyzed to identify a command using speech recognition processing. For example, once the speech input is translated into text, the text of the speech input can be compared to text associated with commands known to device 100 to determine if any of the speech input text corresponds to (e.g., matches) the command text. If a textual correspondence is found, in whole or in part, in the speech input, device 100 can execute the command corresponding to the command text that matches the speech input text. In some embodiments, a multi-lingual method is implemented. Further, there are two cameras: a camera 9 and camera system 12. The camera 9 may be used as a picture camera for images and videos. The cameras comprise of image sensor(s) which receive light from the environment, projected through one or more optics, and convert the light into data representing an image, which is then transmitted to the processors for further processing. The image sensor(s) can include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors or any other device to capture visuals. The image sensors are positioned and housed such that the touch screen display may be used as a viewfinder. In different embodiments, the position and/or location of the image sensor(s) and/or housing may be fixed and/or may be movable by using a suitable mechanism like rotating, sliding etc., for instance by changing the position and/or location of the housing of the image sensor(s), optics etc. Similarly, in different embodiments, the image sensor(s) may be placed in front of the device and/or at the back and/or at any side depending upon the embodiment; for instance, if the embodiment is embodied as a wearable device, like one worn on the wrist, then the camera may be placed on top or at any side, such that the camera can capture visual-based data, such as appearing on the display and/or at any side, together and/or separately. Similarly, if it is in some other form, like that of a portable device, then it may be placed such that it is convenient for usage, like on any side depending upon the embodied form and/or structure. The operations of the cameras may be controlled by using any common controller or separate controllers, say a camera controller. Further, camera system 12 comprises of an IR (infrared) light source that the camera controller may control to transmit relatively short IR light pulses to illuminate a scene in the field of view of the camera. For each transmitted light pulse, the camera controller may shutter the camera ON and OFF for a corresponding short exposure period, responsive to a transmission time of the IR light pulse, to image light that features reflect from the light pulse back to the camera.
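Returning to the command-matching step described at the start of this passage, a minimal sketch is given below; it assumes the speech input has already been converted to text, and the command phrases and action names are invented examples rather than a prescribed command set:

    # Hypothetical command table mapping recognised phrases to device actions.
    COMMANDS = {
        "open torch": "LAUNCH_TORCH",
        "take snapshot": "CAPTURE_SNAPSHOT",
        "open camera": "LAUNCH_CAMERA",
    }

    def match_command(speech_text):
        """Return the action whose command text is found, in whole or in part, in the speech text."""
        lowered = speech_text.lower()
        for phrase, action in COMMANDS.items():
            if phrase in lowered:
                return action
        return None  # no textual correspondence found; the device would ignore or re-prompt

    # Example: match_command("please open camera now") -> "LAUNCH_CAMERA"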

The controller determines round trip times and distances to features responsive to the imaged light. The same facilitates in capturing visuals in biometric authentication, viz. facial recognition and iris-based authentication. In some embodiments, the same camera may also facilitate in capturing visual-based user data. Further, different embodiments may have different types of cameras and methods of implementation (depending upon the type of embodiment); for instance, instead of using IR light, a laser light may be used to determine the round trip time and distance of the object, or a depth-based camera may be implemented which may determine the depth of the target object by illuminating the object with controlled patterns of dots, using infrared light or an LED etc. Similarly, the camera system 12 configuration and range may be like that of a normal camera or a 3D camera or a TOF-3D camera. In some embodiments, one or more different authentication methods can be combined to work as a single authentication protocol; for instance, one or more patterns of eye recognition, like iris or retina etc., and facial recognition together may constitute a facial print, or voice-based authentication can be implemented together with these to authenticate the user. For biometric authentication, the biometric sensors may use any approach and combination of suitable computing codes and/or methods for one or more detection and/or recognition methods which may facilitate in determining the biometric prints. The memory may, for example, comprise sets of instructions for biometric authentication to determine the biometric print of the user. The codes may comprise at least one classifier that is trained to recognize biometric components and/or characteristics by training the detection system to distinguish the biometric components and/or characteristics using suitable training sets, some of which may contain the biometric components and/or characteristics and some of which may not. Optionally, the detection algorithm is a global detection system trained to detect the biometric component and/or characteristics as a whole from the provided biometric sample. Optionally, the detection algorithm is a component-based detection system which determines the presence of the biometric characteristics from assessments provided by component classifiers as to whether characteristics of the biometric components are present in the biometric sample. The assessments are then combined by a holistic classifier to provide a holistic assessment as to whether the sample provided by the user evidences the biometric template. The component and holistic classifiers may be trained on suitable training sets. Two or more of such pattern recognition algorithms may be combined together to form a single authentication method.
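Returning to the time-of-flight measurement described above, the distance to a feature follows from half the round-trip time multiplied by the speed of light. A small worked sketch (values purely illustrative):

    # Time-of-flight distance estimate: the pulse travels to the feature and back,
    # so distance = (speed of light * round-trip time) / 2.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def distance_from_round_trip(round_trip_seconds):
        return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

    # Example: a 4 nanosecond round trip corresponds to roughly 0.6 m.
    print(distance_from_round_trip(4e-9))  # ~0.5996 m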

Further, the input/output interface may also comprise of one or more controllers to manage the operations of the plurality of push buttons 26, which may facilitate in powering on/off, resetting, and providing input for performing features with respect to other employed units, like adjusting the sound output, capturing audio- and visual-based data, launching the microphone function, adjusting the display lights, and launching the camera and torch. A quick press of one or more of the push buttons 26 may disengage a feature, or multiple combinational presses may activate one or more of such features. The selection of shortcuts may be made through the settings menu. Further, a quick press on the push button may unlock the touch screen or begin a process, such as a process comprising of authorization to unlock the device. For example, a touch screen unlock may happen by facial recognition after a push of the button, such that the user may not have to give a separate input for unlocking. In some embodiments, the embodiments may automatically unlock the screen when the user comes into view of camera system 12 to facilitate with user convenience, for example in wearable devices which can be worn on the wrist, or in any other embodiment. Further, device 100 comprises of a flash light (LED) 10 which may be used with the cameras for capturing visual-based user data. The input/output interface may use the camera controller to control the operations of LED 10 or may use a separate controller. In different embodiments, a plurality of light sources may be used and may be placed as per embodied design considerations. In some embodiments, device 100 may not comprise of a flash light. Further, in this embodiment, an LED 10 is placed on the front side of the user interface to indicate the power operations of the device. The sensor module 108 may be used to provide automatic display light adjustment and gesture-based input like shaking gestures, tapping gestures, pressing and tilting gestures; for example, changing the orientation of the device and/or display may change the orientation of the GUI, and particular gestures may turn on the display lights, launch the camera function, magnify the display, launch the torch function, or launch features like the media player. In different embodiments, the features with respect to the sensors may be more or fewer based on the type of embodiment, and further the features may also be changed, such as by using an external input device having sensors etc. for giving input. To facilitate with these features, the memory may, for example, comprise of sets of executable instructions for detecting and/or measuring a change of state with respect to orientation angle, tilt angle and force applied in response to user input, by determining the change of state. To determine the change of state, a detection algorithm may be implemented by using one or more classifiers to determine the states to which new observations belong, where the identities of the states are known states. The classifier is trained prior to implementation on suitable training sets corresponding to the known states. The classifier may include a plurality of components, wherein each component is configured to resolve a particular collection of inputs to predict the state. A component for predicting movement of the device along one or any combination of the x, y, z axes may be configured to resolve sensor data, which may include acceleration and/or velocity data, and position data with associated time data.
The classifier component for detecting a change in state can be defined using a predetermined decision tree acting on acceleration sensor inputs, which may also include velocity sensor input, optionally conditioned by a Markov model or any other suitable model for a potential improvement in accuracy. Optionally, derived velocity data can be used to confirm the change in state, for example by determining a change in velocity or a small apparent velocity, the latter accounting for any error in velocity determination, or it may be used as an input for a user gesture. The classifier component for predicting a change in state is configured to resolve sensor data including device acceleration data, device position data and device velocity data. Similarly, the illumination sensor can be used to measure and/or determine a change in the intensity of light in the vicinity of the device for automatically adjusting screen brightness and/or turning off the display lights.
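A minimal sketch of such a predetermined decision tree over motion features is given below; the feature names, thresholds and state labels are invented for illustration, whereas a real classifier would be trained on labelled sensor data for the known states:

    # Each observation is assumed to be (acceleration_magnitude, tilt_angle_degrees,
    # velocity_magnitude) already derived from the sensor module.
    def classify_state(acceleration_magnitude, tilt_angle_degrees, velocity_magnitude):
        if acceleration_magnitude > 2.0:
            # Strong acceleration spikes with little sustained velocity look like a shake/tap.
            return "SHAKE" if velocity_magnitude < 0.3 else "MOVING"
        if abs(tilt_angle_degrees) > 60.0:
            return "TILTED"          # could trigger a change of GUI orientation
        if velocity_magnitude < 0.05:
            return "STATIONARY"
        return "MOVING"

    # Derived velocity (for example, integrated from acceleration) can then be used to
    # confirm the predicted change of state before the GUI reacts.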

GUI & FEATURES

The computer programs and/or sets of instructions stored in the memory of device 100 comprise of instructions which, when executed by one or more processors, facilitate in displaying a GUI comprising of selectable and scrollable contents. The graphics may comprise of various known components for rendering and displaying graphics on the touch screen, including components for changing the intensity of the graphics that are displayed. Graphics may collectively include, without limitation, icons, images, videos, symbols, animations, text etc. In some or alternate embodiments, device 100 may comprise of a menu-driven or line-based or similar displayed user interface. When one or more user inputs are given for selection of selectable content, the codes executed by the processing unit facilitate with output based on the provided user input with respect to the process/content. Selection of a feature and/or function through one or more user inputs may open the UI page for the selected activity with respect to that feature and/or function, and/or may execute the codes of said activity in the background. The executed code for the activity may generate and give output by any suitable means, such as visual output, sound-based output and vibrations, based on the content. Access to GUI content can be implemented by any suitable means. In the exemplary embodiment, the user profile home page comprises of selectable data type icons which can be selected to open the data type main page. Further, the settings icon and features can be accessed through a drawable menu. Once a particular content is selected by user input, the codes executed in the background may display the UI page of said content. The settings UI page comprises of selectable content laid out in a categorical manner. In some embodiments, the GUI layout may be different, like that of full-screen swipe-based icons (where each data type may have a separate page), a list (where data types, features and settings are in list form), or it may have a changeable layout such that the layout can be changed from the settings menu by selecting from a list of selectable options etc. Further, in different embodiments, the settings and options with respect to a data type may be made accessible by any suitable means, such as by providing a shortcut/option button on the page of the data type, or by giving a user input on the page which may open an options menu, like that of a haptic feedback mechanism based on a long press, or by any other suitable means. Similarly, navigation through/in the GUI may be implemented by any suitable means, such as through soft buttons (icons) like back, home page etc., gestures like a swiping gesture which may move the page or content to the previous and/or next page, by providing fixed back and home buttons (physical), or by any other suitable means. In the exemplary embodiment, an option button and a haptic feedback mechanism are provided to access options, and further soft buttons are provided to go to the previous and home pages. Further, output corresponding to alerts and notifications, both system-generated and user-input based, may comprise of visual output, sound output and vibrations, which can be selected from the settings menu. Further, in the exemplary embodiment, to access one or more functions and/or features, user input may also be given by a virtual key, voice command, gesture-based methods like tapping, sliding, swiping etc., or through push buttons. The settings for voice commands and shortcut-based input, like those given by using the virtual key, additional sensors and push buttons, can be accessed from settings.

Further, in some embodiments, the push buttons also facilitate in providing input in the GUI, based on the selected option. Further, for entering data, the exemplary embodiment comprises of a soft keyboard. Further, data can be entered by using the speech-to-text feature by selecting the mic icon from the keyboard. In some embodiments, a connected external device can be used for selecting a text field and entering data. In some embodiments, a text field takes input by speech-to-text synthesis only. In some embodiments, the user may get options for selecting the input method from a list of available input methods when an input field is selected or before the said process, or the device may have another default method or an option to select the default input method. Further, some exemplary embodiments may work as exampled in the respective module sections. Some embodiments may comprise of one or more different input methods for receiving user input, for instance by using different sounds like thumping, clapping etc., and/or by using one or more image sensors, for example by measuring changes in pixel intensity, or any other method etc. The exemplary embodiment further facilitates with features with respect to device 100 functionality. The features may comprise of features related to convenience tools, device security, and device and system settings. In some embodiments, the features may be fewer or more, or there may be different sets of features, based on the type of embodiment.

The features for user convenience may comprise of features related to convenience tools with respect to device functionalities.

For example, some embodiments may facilitate with multiple input methods as disclosed. Some embodiments may comprise of soft keyboards, comprising of selectable icons which may be like those of a standard (QWERTY) keyboard, which may have a reduced number of keys relative to the keys in existing physical keyboards, and which may, for instance, be similar in appearance to those seen in portable hand-held devices for giving user input. In the exemplary embodiment, the soft keyboard may auto pop up when a writable field is selected by user input. In some embodiments, it may be implemented by different methods, such as an input menu being given for selecting the type of input method. In some embodiments, a default input method can be set from settings. The keyboard embodiments may be adaptive. For example, displayed icons may modify in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. Keyboard embodiments may display predictive text for selection which may be tailored to word usage history (usage, lexicography). Further, the keyboard may comprise of a settings icon which may launch the settings of the keyboard. The settings of the keyboard can also be accessed from the settings icon, and comprise of language selection, selection of different keyboard layouts like those of soft keypads, and a voice-based input selector. In some embodiments, one or more keyboards may comprise of a mic icon, which when selected may activate the speech-to-text feature, such that spoken words get converted into text and displayed in the selected field. Some embodiments may comprise of a keyboard which may comprise of character recognition ability, which may display a content area for taking input and further recognize handwritten characters to display the result as text; for instance, written content may provide input which may execute code for recognizing the content and may display output accordingly. In such keyboards, if the content is not recognized by the content recognizer, then it may either display the nearest recognizable character from the vocabulary library or may display an error notification, which may be given as sound or vibration or as text output or any combination thereof. Further, keyboard embodiments and types may be adaptive to the embodiments of devices 100. Another example is the random key generator, which comprises of text-based characters like alphabets, numbers, special characters etc. for generating random codes, and comprises of selectable components for generating codes based on user input, wherein the user input may comprise inputs for selecting the nature of characters and/or selecting the length of the generated codes, such that when one or more user inputs are given for key generation, the codes executed based on the user input display the generated code in a text field. User input on the generated code, such as tapping one or more times or a long press, may copy the code, which the user may paste in the required field, such as by giving a double or long tap on a text field, which may open an option menu for selecting, copying, pasting, deleting or cutting text from that field. The user may select the suitable option. Further, in some embodiments, the random code generator may have a selectable option for auto suggestion, selectable from settings, which when enforced executes codes for suggesting random codes while entering data in the relevant text fields of templates.
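A minimal sketch of such a random key generator, using Python's standard secrets module; the character classes and default length are assumptions, not limits taken from the disclosure:

    import secrets
    import string

    def generate_code(length=16, use_letters=True, use_digits=True, use_special=True):
        """Build a random code from the character classes the user selected in the GUI."""
        alphabet = ""
        if use_letters:
            alphabet += string.ascii_letters
        if use_digits:
            alphabet += string.digits
        if use_special:
            alphabet += "!@#$%^&*()-_=+"
        if not alphabet:
            raise ValueError("at least one character class must be selected")
        return "".join(secrets.choice(alphabet) for _ in range(length))

    # Example: generate_code(12, use_special=False) might return 'q3VZk8Tnb0Xa'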
Another example is the virtual soft key, which when pressed (selected by input) displays shortcuts for selecting a few features and/or functions, which when selected execute the codes of the selected options, such as capturing a snapshot of the displayed GUI, opening one or more additional display windows etc. The settings of the virtual key may comprise selection of the number of options to be displayed and the features and/or functions from a list of options, which may be selected using the controls provided in the GUI. Another example of a feature is the ability to capture snapshots, wherein when the input for capturing a snapshot is given, codes are executed to capture a snapshot of the contents displayed in the GUI, which may further be displayed on the page in the form of a thumbnail, which when selected enlarges the snapshot and displays selectable contents comprising of editing, saving and discarding the captured snapshot. The save button saves the snapshot in the media gallery, the discard button discards the snapshot and the edit button opens the image editor for editing the snapshot. The input for taking a snapshot may be implemented by any suitable means, such as by using the virtual key, push buttons or any other suitable means. Another example is the device manager folder(s), comprising of a media gallery (folder) from where multimedia-based data like snapshots, videos, audio and other transferred data may be accessed. Further, some embodiments facilitate in creating sub-folders in the device manager folder, which can be selected from the options menu of the device manager folder through the GUI, which when selected executes codes which display output in the GUI for creating, naming and listing the folder, which, once created, is disposed in the page. Further, the options menu also comprises an option for arranging folders, such as sorting by name, size, user-defined order etc. A long press on some icons or thumbnails, like folders or contents in the device manager folder, may display options with respect to the selected icon, where the options may comprise of options for cutting, copying, pasting, deleting or moving the selected item; if the user selects the copy or cut option, then the code for the same may be executed and the user may use a swiping gesture on the touch screen for navigating through the folders of the device manager folder, though in other embodiments navigation may be implemented differently, for example by giving a back button on top of the displayed pages or any other method etc. The user can then paste the copied content by giving a long press at an empty space in the displayed GUI. The delete option displays a confirmation text box with selection buttons like yes and no, which may be selected as per the user's choice. Another example is the split window display, which may be selected, for instance, from the virtual key or push buttons etc., and which when selected executes codes for generating and displaying a split window on the page, through which the user may navigate for opening the desired content. The split window may be adaptable, such that its size and location on the page may be adjustable. Another example is the search bar, through which the database of the user's profile and/or features and/or settings and/or other associated components can be searched by providing user input in a text field, such that when user inputs are provided, the processor executes codes to display the relevant output in the GUI. Another example is the torch, which when selected by giving user input from the GUI or from the virtual key or push buttons, activates the torch function. In some embodiments, providing user input may open a UI page with selectable contents comprising of on/off buttons and adjustment of the intensity of light.
Another example is the digital magnifier, which when selected by user input from the GUI or from the virtual key or push buttons, activates the magnifier function by activating the camera and displays the content in the GUI. The GUI also displays some selectable contents with respect to the magnifier, which when selected execute the codes of the selection, such as zoom in and zoom out, or activating the flash light. Another example is copying text to another computer readable format, which when selected executes codes that display the UI of the selected feature. The GUI comprises of selectable options for selecting and creating text-based data from the user's profile database in another computer readable format, comprising of a location selection tool which opens a GUI for selecting the storage location (internal and external), a content selection tool for selecting content which opens a GUI based on categorical content for making the selection, and format selection tools which open the GUI for selecting the type of format like xls, pdf etc. Once the selection is made by giving user input and the ok button is pressed, codes are executed based on the user input to generate and store the data in the selected location. In some embodiments, the same can be given from the data type options menu. In some embodiments, the selection can also be done by push button and/or virtual key and/or voice commands. Another example of a feature is printing, which facilitates in printing user data comprising of text and images. When the selection for printing data is made from the GUI, the UI page of printing options opens. The UI page comprises of selectable content comprising of printing tools for selecting the printer, content view and other settings. Once the options are selected from the GUI and the print button is pressed, codes are executed for printing the user data from the connected printer. The printer can be connected through the micro-USB module, by connecting the printer through a cable. In some embodiments, device 100 facilitates in pairing the printer and other devices, like I/O devices, by connecting through wireless connectivity drivers, such as, for example, a Bluetooth dongle connected through the micro USB port, and selecting the pairing options from the connectivity UI page. In the exemplary embodiment, the device facilitates in connecting through a cable. Some embodiments may have a content reader feature, which gives sound output of the content displayed in the GUI, such that when the option for the same is selected, for example from the virtual soft key or from the settings menu, the executed codes convert the text into speech for reading the content displayed on the screen. Another example is the voice command feature, which when activated facilitates in launching features and/or functions based on the user's command. The voice command feature may be activated through the GUI or from a push button by making a selection from the settings of the push button. Another example is the user profile feature, wherein some embodiments may facilitate in creating more than one user profile, the option for which may be selected through the GUI. The profile may be created either while configuring device 100 at initial startup, or after resetting device 100 to its factory state, or from the settings menu for creating a new profile. The profile generation comprises a series of steps with displayed selectable components, which when selected execute codes for generating the profile based on the given user inputs. The selectable options may comprise of options for naming the profile and other security related options like authentication and authorization enforcement, recovery profile creation etc. In some embodiments, one or more options may be skippable. In the exemplary embodiment, the options cannot be skipped. After the series of steps, the new profile may be generated and the user may be taken to the new profile page. In some embodiments, after the profile is generated, an option for profile selection may be displayed. In exemplary embodiments, the generation of a new profile requires the enforcement of authentication on the other profiles also. In some embodiments, generation of a new profile may not require authentication enforcement on other profiles.
In exemplary embodiments, the user may switch profiles from the existing profile, which may require authentication for switching the profile, using the other profile's authentication method. In some embodiments, the user may not be able to switch profiles from the logged-in profile. In some embodiments, profiles can be switched with and/or without a password. In the exemplary embodiment, after device 100 is started, the user is prompted to get authenticated for accessing the features. In some embodiments, after device 100 is started, a profile selection page is displayed for selecting the profile. Further, a created profile can be deleted by logging into the profile and selecting options from the profile settings. In some embodiments, a profile can be deleted from the profile selection page and/or from another profile. Further, in some embodiments, profile creation may also comprise of an option for privilege selection, like that of an admin profile, which may give extra privileges to that profile, such as using the admin password for accessing and/or removing the password of some locked feature of a non-admin profile, deleting a non-admin profile etc. In some embodiments, the first profile generated in device 100 may have the default privileges of an admin profile, and other profiles created may or may not have access to admin privileges. In some embodiments, device 100 may further comprise of a scanning feature, to scan documents through the image sensor, which may facilitate in capturing the data through the image sensor.

Further, the codes stored in memory may comprise of codes to facilitate with some security features with respect to the security policies of devices 100.

In different embodiments, the codes executed may facilitate with one or more security features, which may be system-enforced and/or selectable for enforcement through the GUI.

For example, the memory may comprise of executable codes to facilitate with encryption protocols, which when executed may encrypt the memory and/or computer programs and/or the profile's associated data and/or only the user data. The encryption codes may be implemented by any suitable methods. In some embodiments, the processor executes codes which enforce encryption when device 100 is initially started and/or after the device is reset. In some embodiments, the processor executes codes which display a prompt with selectable content for enforcing encryption, which in some embodiments is displayed as an optional selection with selectable icons for enforcing or skipping encryption; if the option is skipped by providing user input, then it may be accessible from the settings menu, where the settings menu may comprise of options for enforcing encryption, where the options may comprise of options to enforce encryption such that the memory may get encrypted along with the stored content, and/or the computer programs and associated profile data may get encrypted, and/or only the profile and associated data may get encrypted, and/or only the user data may get encrypted. In some embodiments, there may be options for enforcing extra layer(s) of encryption, such that the associated profile data and/or only the user data and/or the entire memory may get encrypted after selecting and enforcing the option from the GUI. In some embodiments, the associated profile's data and/or user data automatically gets encrypted with one or more extra layers of encryption after profile generation. In some embodiments, the encryption process may auto trigger authorization protocols, such that the GUI may display options for selecting one or more authentication methods, selectable and enforceable by user input, for getting authenticated for decrypting devices 100 and/or the user profile and/or the user data. In some embodiments, the displayed options triggered for enforcement of authentication may be provided with options to skip enforcement of the authentication methods. In some embodiments, devices 100 may not facilitate with encryption protocols. In the exemplary embodiment, encryption is auto enforced, which also encrypts the associated profile's data with an extra layer of encryption.
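The disclosure does not fix a particular cipher or library; as one hedged illustration, user data could be encrypted with a key derived from the user's passphrase, for example using the Python cryptography package (the library choice, iteration count and key length are assumptions, not part of the claimed design):

    # Illustrative only: derive a key from a passphrase with PBKDF2 and encrypt user data
    # with Fernet (authenticated encryption from the "cryptography" package).
    import base64
    import os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def derive_key(passphrase: str, salt: bytes) -> bytes:
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
        return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

    def encrypt_user_data(passphrase: str, plaintext: bytes):
        salt = os.urandom(16)                       # stored alongside the ciphertext
        token = Fernet(derive_key(passphrase, salt)).encrypt(plaintext)
        return salt, token

    def decrypt_user_data(passphrase: str, salt: bytes, token: bytes) -> bytes:
        return Fernet(derive_key(passphrase, salt)).decrypt(token)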

Another example of a security feature is the authentication protocol, which facilitates the user in getting authenticated. The memory may comprise of codes which when executed may facilitate with a GUI comprising of selectable contents for selecting and enforcing authentication methods. In some embodiments, the authentication methods comprise of non-biometric methods such as text based, knowledge based, external hardware based etc., for example pattern recognition based, text recognition based, text-generation based, text-generation time dependent based etc., such that when one or more methods are selected and enforced by giving one or more user inputs, the user inputs are evaluated by the executed codes to determine whether the inputs given are responsive to the selected authentication method, and if they are in accordance with the authentication method, then the user inputs are stored in the form of a template in the user profile data. To get authenticated, when the user gives one or more inputs, for instance a swipe on the screen, the executed codes display the GUI as per the stored template of the enforced authentication method for the selected process, for user inputs in accordance with the enforced authentication method for that process, such that the new inputs given by the user for getting authenticated are matched with the required inputs to determine whether the inputs given are in accordance with the inputs required for the user to get authenticated as per the enforced authentication method; if the inputs are in accordance with the required inputs, then access is granted to the user. The exemplary embodiment comprises of password, numeric key and pattern-based authentication methods. Further, the exemplary embodiment comprises of biometric authentication methods, viz. facial recognition, voice-based text dependent authentication and fingerprint authentication, where the biometric prints of users are used to authenticate the user.

Another example of a security protocol is the authorization protocols, comprising of single or multiple authorization protocols, wherein an authorization protocol may comprise of methods and techniques which may at least define how the user can be authorized; when and for which process (i.e. function or feature or profile or associated profile data or profile user data) the user has to get authorized; what access the user holds after getting authorized; and how many attempts the user may have to get authorized. For authorizing and granting access to the user, the user may be authenticated by one or more authentication methods. For example, in some embodiments the executed codes may facilitate with a single authorization protocol method, comprising of system-defined processes for which the authorization protocol can be enforced by using one or more different and/or the same authentication methods for one or more different processes. In some embodiments the executed codes may facilitate with multiple authorization protocol methods, comprising of system-defined processes and/or user-selectable processes for which the authorization protocol can be enforced by using one or more different and/or the same authentication methods for one or more different processes. The processes may comprise of processes with respect to device functionality, features, profile and user data, for instance to unlock the device after it is started, or to unlock the GUI for accessing any feature or function, or to access user data, or to access any particular feature or function etc. Further, in some embodiments, if the user fails to get authorized, codes may be executed, which may be user enforced and/or system enforced, such that the user may not be able to access that process and/or any process until the user gets authenticated. Further, in some embodiments, if authentication of the user fails, then it may trigger an auto reset condition, which may be a pre-determined system condition or may be triggered as per an enforced policy selected and enforced from settings and/or enforced from the GUI at the time of enforcement of said protocol. The auto-reset condition may comprise of parameters for erasing and/or resetting the user profile and/or associated data, and/or erasing or resetting entire profiles and/or associated data, and may further comprise an option for selecting the number of attempts which the user may have before the condition gets triggered, or the same may be a predetermined system condition. In some embodiments, failed attempts may trigger the condition of recovery protocols, which may or may not be system-enforced, and/or the user may have enforced them either while configuring and/or creating the user profile and/or from the profile settings. In the exemplary embodiment, the user can enforce the authorization for data access, which can be enforced by either using the pre-enforced method and/or a new authentication method. Further, the recovery profile generation option is selectable in the exemplary embodiment.
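A minimal sketch of the attempt-limit logic described above, with an invented attempt count and hypothetical callbacks standing in for the auto-reset and recovery conditions:

    # Illustrative sketch: count failed authorizations and trigger the auto-reset or
    # recovery condition once the (assumed) limit is reached.
    MAX_ATTEMPTS = 5   # assumed; the disclosure allows this to be user-selected or system-defined

    class AuthorizationGate:
        def __init__(self, verify, on_auto_reset, on_recovery=None):
            # verify: callable(input) -> bool; on_auto_reset / on_recovery: hypothetical callbacks
            self.verify = verify
            self.on_auto_reset = on_auto_reset
            self.on_recovery = on_recovery
            self.failed_attempts = 0

        def attempt(self, user_input) -> bool:
            if self.verify(user_input):
                self.failed_attempts = 0
                return True                      # access granted to the protected process
            self.failed_attempts += 1
            if self.failed_attempts >= MAX_ATTEMPTS:
                if self.on_recovery is not None:
                    self.on_recovery()           # e.g. offer the recovery profile flow
                else:
                    self.on_auto_reset()         # e.g. erase/reset the profile data
            return False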

Another example of a security feature is the recovery protocol, which may be either system enforced and/or enforced by the user at the time of profile generation and/or at the time of enforcement of one or more authentication methods and/or later from settings. The recovery protocols comprise of techniques which define how access can be recovered to one or more processes on which authentication methods have been enforced.

For instance, when the recovery protocol is triggered, the user may recover access to that profile and/or data and/or feature and/or function by getting authenticated through one or more of the enforced recovery methods, wherein the recovery methods comprise of creating a recovery profile by using one or more methods including, but not limited to, one or more authentication methods facilitated by devices 100 along with other text based authentication methods such as a system generated text code and/or answering selectable questions which may be system generated and/or definable by the user. Further, in some embodiments, one or more profiles may be recoverable from an admin account; for example, the admin account may have authorization to remove the authentication process implemented on a profile or its associated process, or may have the privilege to log in to other profile accounts through the admin password. In some embodiments, devices 100 may facilitate with one or more admin profiles which may have some and/or all system generated authorization privileges, and/or privileges given to the admin account from the settings of non-admin accounts. In exemplary embodiments, the recovery profile can be generated through biometric authentication methods and other methods, viz. password, key and pattern.
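
The following is a minimal, hypothetical sketch of one possible recovery profile, assuming recovery via either a system generated text code or user-defined question answers; the class and field names are illustrative only.

```python
import secrets

class RecoveryProfile:
    """Holds a system-generated recovery code and user-defined question answers."""

    def __init__(self, questions_and_answers: dict):
        # System-generated text code, shown once to the user at enrolment time.
        self.recovery_code = secrets.token_hex(8)
        self._answers = {q: a.strip().lower() for q, a in questions_and_answers.items()}

    def recover_with_code(self, code: str) -> bool:
        return secrets.compare_digest(code, self.recovery_code)

    def recover_with_answers(self, answers: dict) -> bool:
        # Every enrolled question must be answered correctly.
        if set(answers) != set(self._answers):
            return False
        return all(self._answers[q] == a.strip().lower() for q, a in answers.items())

# Example: recovery via either the generated code or the stored answers.
profile = RecoveryProfile({"First school?": "Hilltop", "Favourite colour?": "Blue"})
print(profile.recover_with_code(profile.recovery_code))                  # True
print(profile.recover_with_answers({"First school?": "hilltop",
                                    "Favourite colour?": "blue"}))       # True
```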

Another example of a security feature is text masking, where at least while entering the text, the text entered in the text field may get masked such that the entered text is not visible unless the option for unmasking is selected from the GUI. In some embodiments, the masking feature is auto enforced, such as in template-based UI. In some embodiments, the masking feature is user enforceable by accessing its settings from the GUI. In exemplary embodiments, the masking is auto enforced.
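
A minimal sketch of the text masking behaviour is shown below (hypothetical names); the entered text is stored but displayed as mask characters until the unmasking option is toggled.

```python
class MaskedTextField:
    def __init__(self, mask_char: str = "*", auto_masked: bool = True):
        self._value = ""
        self.mask_char = mask_char
        self.masked = auto_masked     # masking auto-enforced by default

    def enter(self, text: str) -> None:
        self._value += text

    def toggle_mask(self) -> None:
        # The GUI option for unmasking / re-masking the entered text.
        self.masked = not self.masked

    def display(self) -> str:
        return self.mask_char * len(self._value) if self.masked else self._value

field = MaskedTextField()
field.enter("s3cr3t")
print(field.display())    # ******
field.toggle_mask()
print(field.display())    # s3cr3t
```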

In some embodiments, the security features may also comprise of features with respect to storing a security token in an external hardware device. In different embodiments, the same may be implemented on one or more different processes by any suitable method and means, such that the security token may be exchanged when devices 100 get connected to the external device for authentication. For example, while implementing an authentication method, such as on an encryption process or any authorization process, the memory may comprise of executable instructions which may display a prompt for connecting the security device; when it is connected, the executed codes may store a security token on the connected external device by any suitable means, such that when the process needs to get decrypted (user authorization), then after said security device is connected with device 100, said process may accept the security token to decrypt accordingly.
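
The following non-limiting sketch illustrates the token hand-off described above, with the external device modelled as a plain in-memory store for illustration only; a real embodiment would write the token to the connected hardware by any suitable means.

```python
import secrets

class ExternalSecurityToken:
    """Sketch of the token hand-off: a token is written to the connected
    external device at enrolment, and must be presented again to unlock."""

    def __init__(self):
        self._expected = None

    def enroll(self, external_device: dict) -> None:
        # Generate a token and store it on the connected external device
        # (a dict stands in here for the device's storage).
        token = secrets.token_bytes(32)
        external_device["security_token"] = token
        self._expected = token

    def unlock(self, external_device: dict) -> bool:
        # When the external device is reconnected, its stored token is read
        # back and compared before the protected process is decrypted.
        presented = external_device.get("security_token")
        if self._expected is None or presented is None:
            return False
        return secrets.compare_digest(presented, self._expected)

usb_key = {}                        # hypothetical stand-in for external storage
guard = ExternalSecurityToken()
guard.enroll(usb_key)
print(guard.unlock(usb_key))        # True
print(guard.unlock({}))             # False: device without the token
```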

The features for device usability and device operations comprise of features related to settings for personalization, display, shortcuts, accessibility and operations. The same is accessible by selecting the settings icon displayed in the GUI. The layout may be given by any suitable means, for example, list based where each item is listed on a single page or suitably characterized etc. When the input for a particular option is provided, it executes codes for displaying the GUI of the selected option. Further, options selected for a particular setting execute codes based on the provided user input for the selected option. For example, the settings menu may comprise of a system information selection, which when selected executes codes to display an information page with respect to storage information, battery information and other information related to devices 100.

Another example is the settings for language, region, date and time, from where user input given for an option displays a page of the provided options, with respect to which content such as date and time, language for displayed content and region can be changed, which when selected and/or enforced by one or more user inputs may execute codes to put the selection into effect. Another example is display settings, which comprise of settings related to content displayed on screen such as magnification, text size, boldness, font, line spacing, auto brightness selection, orientation setting, and changing the theme, such as displaying background and content in dark or light contrast. In another example, basic functions can be selected and performed, like powering off, locking the GUI, restarting, resetting to original factory state, turning on/off display lights, deleting a profile, erasing the database of a profile, and resetting profile settings to default. In another example, the screen saver option can be selected to select the wallpaper, the displayed style of notifications and the content for the locked screen. In another example, shortcut selection can be made from the shortcut selection menu for launching features and/or functions with one or more input methods, provided through push buttons and/or gesture based user inputs. In another example, the settings for speech to text synthesis can be accessed from accessibility settings, which comprise of a language selection option and a speed selection option which, when selected by user input, execute the codes to put the selection into effect. In some embodiments, the synthesized voice language gets auto-selected based on the default region and/or language selection. In another example, voice command settings can be selected through the accessibility menu, which comprise of a list of voice commands with options for selecting and deselecting commands and for editing and/or changing the phrases. Further, in some embodiments, the device 100 may also facilitate in generating a voice command for a particular function or feature. In another example, shortcut settings for the virtual soft-key can be accessed from accessibility settings, comprising of options for selecting the feature and the quantity of features which can be accessed through the soft-key. The security settings comprise of authentication settings and authorization settings. The authentication settings comprise of the authentication methods and their settings. The authentication methods can be activated/deactivated through their respective settings pages.
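
By way of illustration only, a list-based settings menu of the kind described above can be modelled as a mapping from displayed labels to the codes executed on selection; the labels, handlers and printed values below are assumptions, not taken from the specification.

```python
# Hypothetical sketch of a list-based settings menu: each entry maps a
# displayed label to the code executed when the user selects it.

def show_system_information():
    print("Storage: 28.4 GB free | Battery: 76%")   # illustrative values only

def toggle_auto_brightness():
    print("Auto brightness toggled")

SETTINGS_MENU = {
    "System information": show_system_information,
    "Display > Auto brightness": toggle_auto_brightness,
}

def select_setting(label: str) -> None:
    # Executing the codes for the selected option puts the selection into effect.
    handler = SETTINGS_MENU.get(label)
    if handler is None:
        print(f"No such setting: {label}")
    else:
        handler()

select_setting("System information")
```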

When any of the authentication methods is selected from the list of available methods, the codes executed by the selection display the page of the selected method, with instructions to be performed by the user. After one or more inputs are provided by the user, the authentication method may get enforced. If the user provides incorrect input, the codes may display an error message and may further ask for the correct inputs.

To deactivate an authentication method, the user may select the option from the authentication method page from which the authentication method was implemented. The authorization settings comprise of the process list (features and functions) on which the user can enforce an authentication method. The authorization list comprises of selectable buttons, which can be toggled by user input to enforce the authorization method. In different embodiments, device 100 may facilitate in enforcing one or more different authentication methods for one or more different authorization processes and/or the same authentication method for all authorization processes.
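
A minimal sketch of the authorization settings list follows (hypothetical process names); each selectable button is modelled as a boolean toggle that enforces or removes authentication for its process.

```python
class AuthorizationSettings:
    def __init__(self, processes):
        # False means no authentication is enforced on that process yet.
        self.enforced = {process: False for process in processes}

    def toggle(self, process: str) -> bool:
        # User input on the selectable button flips enforcement for that process.
        self.enforced[process] = not self.enforced[process]
        return self.enforced[process]

settings = AuthorizationSettings(["unlock_device", "notes", "media_gallery"])
print(settings.toggle("notes"))        # True  -> authentication now required
print(settings.toggle("notes"))        # False -> enforcement removed
```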

When devices 100 are initially started or after a reset, one or more processors execute codes stored in memory and the GUI is displayed on screen.

In some embodiments, the displayed GUI displays a configuration page comprising of selectable components which, when selected, execute codes and display output with respect to the provided user input. The configuration comprises of a series of steps to configure devices 100, comprising of language, date and time, profile generation, security and some display settings. In some embodiments, devices 100 may not have configuration settings or may have different sets of settings. In some embodiments, some or all steps may be optional and can be skipped. In some embodiments, the displayed options may not be optional to skip. In the exemplary embodiment, the displayed selectable components are optional and can be skipped and accessed later from settings. If the steps are skipped, a default profile is generated and the home page is displayed. The newly generated profile's home page starts with a user demonstration guide which highlights selectable contents comprising of data types and a drawable menu comprising of features and settings icons. Further, the GUI demonstrates and suggests methods and techniques for storing data and using features. Further, a user guide is listed in the help section of the settings menu, which details every feature and function facilitated and also details how to perform them. In some embodiments, the display guide and/or GUI demonstration method may not be there. The drawable menu comprises of selectable contents as icons of a few features and device and system settings. It is a slidable and scrollable menu, in which features related to user convenience are listed. In other embodiments, it may be implemented differently. Further, it may be adaptable to change based on user input; for example, a long press may facilitate the user in changing the arrangement of the displayed features. The selectable components comprise of a random key generator, digital magnifier, torch feature, device manager folder, profile generation icon, printing options, search bar, screen reader and settings icon.
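
The configuration flow with skippable steps can be sketched as follows (step names and defaults are assumptions for illustration); skipping every step still yields a default profile, after which the home page would be displayed.

```python
# Minimal sketch of the first-start configuration flow: a series of optional
# steps that may be skipped, after which a default profile is generated.

CONFIG_STEPS = ["language", "date_and_time", "profile_generation",
                "security", "display_settings"]

def run_configuration(user_choices: dict) -> dict:
    profile = {"name": "default", "settings": {}}
    for step in CONFIG_STEPS:
        choice = user_choices.get(step)          # None means the step was skipped
        if choice is None:
            continue                             # skipped steps keep defaults
        if step == "profile_generation":
            profile["name"] = choice
        else:
            profile["settings"][step] = choice
    return profile

# All steps skipped: a default profile is generated and the home page follows.
print(run_configuration({}))
# Some steps completed:
print(run_configuration({"language": "en-IN", "profile_generation": "user1"}))
```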

At the Profile's home page, when an input is given for a particular data type icon, the home page of that data type is opened. The home page of a data type comprises of an add button "+" in the form of an icon. When "+" is selected by user input, it creates a category for that data type and displays a text field to name it. Once named, it is placed suitably in the data type home page as a selectable tab or icon. A modification option may be opened by selecting the displayed options icon, for opening options comprising of renaming, deleting, color selection (tagging), sorting options, filtering options etc. In some embodiments, the same is implemented through haptic contact, where a long press opens the options menu for the user. After category creation, when the corresponding category page is selected and opened, a "+" icon is displayed which, when selected, displays options to add a sub-category or add data. If data is added, then the option for creating a further sub-category disappears. If add data is selected, a selectable menu opens which comprises of templates for adding the data. The user may select the option as per the requirement and may enter the data. The said data page comprises of a "+" icon which facilitates in adding a further row and/or column as per the selected option. After entering data, by selecting the save button a window for naming and saving the data pops up. The data is placed appropriately by the device 100 in the respective location, which when selected and opened displays the stored data. Further, there is an option to associate an image with text, which may be selected from the corresponding text data options. Further, an alarm may be added with respect to an added reminder or event, which may be displayed appropriately as a notification at the time of the event. The notification style options can be accessed through the device and system settings. The options menu of the data type comprises of color options to distinguish the categories and versions thereof, and options to rename, delete, move, copy, sort by type and custom sort. Further, in some embodiments, there are custom options for the user for storing text based data. The text fields are adaptable for usage; for example, the user can select the custom option to adjust the text field size or to add a separate row or column. In alternative embodiments, the options may comprise of one or more other options, or no such options may be available. In sound-based data, video-based data and image-based data, the add button opens the respective media capturing page, comprising of a plurality of icons for media controls, a media type selector (video or image), a camera selector (front or back, internal or external) and a corresponding options icon comprising of selectable options for image size selector, view type selector, flash selector, timer selector, storage selector (location for storing the media, internal or external, comprising of folder selection and/or creation), shutter sound selector, focus mode selector, image stabilization selector etc. Similarly, the media capturing page of audio data comprises of media controls and an options icon, where the options comprise of a file format selector, quality selector, storage location selector and mic selector (internal or external). In some embodiments, the location for storing a particular data type can also be selected through the options menu of that particular data type, for both internal and external storage. In some embodiments, the location of a data type cannot be changed.
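
By way of illustration only, the category and sub-category behaviour described above can be sketched as a small tree structure (hypothetical names); note how the option to add further sub-categories is withheld once data has been added.

```python
class Category:
    """A category of a data type, holding either sub-categories or data entries."""

    def __init__(self, name: str):
        self.name = name
        self.subcategories = []
        self.entries = []

    def can_add_subcategory(self) -> bool:
        # The sub-category option disappears once data has been added.
        return not self.entries

    def add_subcategory(self, name: str) -> "Category":
        if not self.can_add_subcategory():
            raise ValueError("category already holds data")
        child = Category(name)
        self.subcategories.append(child)
        return child

    def add_entry(self, name: str, rows) -> None:
        # A named data entry built from a template of rows/columns.
        self.entries.append({"name": name, "rows": rows})

banking = Category("Banking")
cards = banking.add_subcategory("Cards")
cards.add_entry("Credit card", [["Number", "xxxx"], ["Expiry", "mm/yy"]])
print(cards.can_add_subcategory())   # False once data is stored
```
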
Once media is captured, the GUI displays a window comprising of a text field with options for naming it or discarding it, and accordingly stores the media and displays it in the form of a thumbnail on the media capturing page. If a media thumbnail is selected, then the media player opens with a plurality of icons for controls, a delete icon and an edit icon. The controls are similar to those of the respective media viewer and/or player. The image viewer displays the images and can be swiped to go to the next or previous media. An edit icon is there; a quick user input on it launches the editor controls. The audio and video player comprises of play icons, stop icons, skip icons, previous icons, and rewind and forward icons. Icons of the player are adaptive; for example, the play icon changes its function to a pause icon when the media is playing. In some embodiments, the audio player and video player share a common media player platform. The edit icon opens the page of the media editor corresponding to the data type. In the exemplary embodiments, the media editors comprise of common editing tools which facilitate in editing the media. The image editor comprises of tools such as those for image enhancement and transformation, like filters, contrast, hue, image cropping and a joiner (joining two or more pictures). The audio editor comprises of tools like an audio trimmer, file format converter, joiner for joining two or more files etc. The video editor comprises of tools related to video enhancement and transformation, like a video trimmer, video joiner etc. The media player and media editor take a swiping gesture to move to the next or previous media. In some embodiments, input for moving to the next or previous media can also be given by using the push button. The push button is also used for launching functions and features. In some embodiments, a quick press of the push buttons activates and deactivates the voice recording, image capturing and video recording features. In some embodiments, push buttons are also used to facilitate media player functioning, such as launching, playing, pausing, and moving to the next or previous media. In some embodiments, the push buttons (additional input interface module 104) give haptic feedback when the feature and/or function is launched. In some embodiments, synthesized sound based feedback is received by the user. In some embodiments, both sound based feedback and haptic feedback are received by the user. In some embodiments, the said input interface also facilitates in launching the function or feature by authorizing the user through authentication; for example, for a feature on which an authorization method has been implemented, the input given by the user for that activity may generate an input for the process comprising of the implemented authorization method, which will activate biometric authentication, for example a facial recognition biometric sensor module or an eye scanning (such as iris recognition) biometric sensor module, to authenticate the user, and if the user is an authorized user, the user may get authenticated and the feature and/or function may get activated. If the user is not an authorized user, an alert may be generated which may be in the form of visuals and/or sound and/or vibrations.
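
A minimal sketch of the adaptive media controls follows (hypothetical names); the single play/pause control changes its function depending on playback state, and a next operation stands in for the swipe gesture or push button input.

```python
class MediaPlayer:
    def __init__(self, items):
        self.items = list(items)
        self.index = 0
        self.playing = False

    def play_pause(self) -> str:
        # One adaptive control: acts as 'play' when stopped, shows 'pause'
        # once the media is playing.
        self.playing = not self.playing
        return "pause" if self.playing else "play"

    def next(self) -> str:
        # A swipe gesture or push button moves to the next media item.
        self.index = (self.index + 1) % len(self.items)
        return self.items[self.index]

player = MediaPlayer(["clip_01.mp4", "clip_02.mp4"])
print(player.play_pause())   # now playing -> the icon shows 'pause'
print(player.next())         # clip_02.mp4
```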

The user can connect to external devices, such as for transferring text-based data, by using memory card 26 and micro USB. In some embodiments, an external memory device can be selected from the UI page of the device manager folder to see the content stored in the external memory only and/or to check the available storage space. In some embodiments, the memory card may not support copying of external data. In some embodiments, the device may facilitate in encrypting one or more external storage devices.
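
The device manager view of an external storage device might be sketched as below; the mount path is an assumption for illustration, and only standard library calls for listing contents and reporting free space are used.

```python
import os
import shutil

def inspect_external_storage(mount_path: str) -> dict:
    # List the contents of the connected external storage and report its space.
    entries = sorted(os.listdir(mount_path))
    usage = shutil.disk_usage(mount_path)
    return {"entries": entries, "free_bytes": usage.free, "total_bytes": usage.total}

if __name__ == "__main__":
    mount = "/media/usb0"                      # assumed mount point
    if os.path.isdir(mount):
        info = inspect_external_storage(mount)
        print(f"{len(info['entries'])} items, {info['free_bytes'] // 2**20} MiB free")
    else:
        print("no external storage connected at", mount)
```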

References given throughout the specification, by way of examples, to a particular feature, function, structure or characteristic facilitated by the different modules described in connection with the embodiments mean that the feature, function, structure or characteristic is included in at least one embodiment of the present invention. Thus, the examples given throughout this specification may, but do not necessarily, all refer to the same embodiment. The above example was just to demonstrate how one or more embodiments of devices 100 can be implemented and not to give a conclusive view of the embodiments.

The following examples pertain to the above or further embodiments which demonstrate how the devices 100 with the respective modules may facilitate to capture, store, list, edit, display and delete different types of data.

Example 1 corresponds to devices 100 which facilitate in displaying the GUI and taking the user input for capturing, storing, listing, editing, moving and accessing text based user data, or audio and video based user data, or any suitable combination of text based user data, audio based user data and video based user data, and further to facilitate with one or more features.

Example 2 corresponds to device 100 of example 1, which comprises a sound input module for capturing sound based user input, where sound based user input may be input for user data and/or for other features and/or functions facilitated by devices 100.

Example 3 corresponds to devices 100 of examples 1-2, which comprises a sound output module 107 for giving sound based output, where sound based output may be sound output of user data and/or of other features and/or functions (system generated output and/or output generated based on user input).

Example 4 corresponds to devices 100 of examples 1-3, which comprises a camera module 108 for capturing visual based input, where visual based input may be input for user data and/or for other features and/or functions facilitated by devices 100.

Example 5 corresponds to devices 100 of examples 1-4, which comprises of one or more additional input interface modules, comprising of one or more additional input interfaces which may or may not be function specific or feature specific or both function and feature specific.

Example 6 corresponds to devices of any of examples 1-5, which comprises of an external device interface module, comprising of at least one external device interface to facilitate in connecting with external devices, where the external devices may be input/output devices.

Example 7 corresponds to devices of example 6, which facilitate in accessing external storage devices for transferring data to/from external devices, wherein the device facilitates to either store, list, edit, move, merge (text), access and delete, or any suitable combination thereof, compatible audio and/or visual based data, and further facilitates to store, list, display, edit (name etc.) and delete unsupported format data.

Further, device 100 facilitates in transferring text-based data into other computer readable formats and other stored data to external devices.

Example 8 corresponds to devices of any of examples 6-7, which facilitate in connecting with external devices for providing sound based output and/or for capturing and storing visual based user data and/or for capturing audio based data and/or for authenticating the user and/or for taking user input and/or for displaying the GUI and/or for printing the data and/or for updating the codes, wherein if the external interface facilitates in capturing visual based user data and/or audio based data, then devices 100 may further facilitate in storing, listing, editing, deleting, moving and accessing the captured user data.

Example 9 corresponds to devices of any of examples 1-8, which comprises of an additional sensor module 109, comprising of at least one additional sensor, to facilitate with their respective functions and features.

Example 10 corresponds to devices of any of examples 1-9, which comprises of one or more light source modules 111, comprising of one or more light sources, to facilitate with their respective functions and features.

Example 11 corresponds to devices of any of examples 1-10, which comprises of one or more biometric sensor modules 110, comprising of one or more biometric sensors, to facilitate in authenticating the user by their biometric prints.

Example 12 corresponds to devices of any of examples 1-11, which comprises of a hardware encryption module to facilitate with hardware-based encryption.

While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. Referring to the above examples, it should be understood that even after a module is employed in devices 100, the devices 100 may not render all its applications or its functions and features. For example, devices 100 of example 12, where all modules are employed, may not facilitate to capture audio and visual based user data but may facilitate with authentication methods by using the camera and sound input mechanism for biometric authentication only. In the same example, devices 100 may further facilitate to store, edit, access and delete text based data, and further facilitate to store, list, play, edit, access and delete compatible audio and/or visual based data transferred from external storage devices and/or captured by using external devices. In another example, embodiments may not comprise of an internal sound input medium and camera modules, and may facilitate in capturing, storing, editing, listing, deleting and accessing text-based data. In the same example, the device may be able to capture and store visual based data and sound-based data using external devices connected through, for example, a USB port, and may further facilitate to list, display, edit, play and delete compatible audio and visual based data transferred from an external storage device. Similarly, embodiments can be carried out in different manners using different components; for example, a device may employ a different category or sub-category of modules in different embodiments. Another example is one in which the device may comprise of a touch screen unit and a hardware-based keyboard having a power ON/OFF button integrated in the keyboard. In another example, the device 100 may take input from the microphone and the additional input interface module to navigate through the GUI and may facilitate with a screen-based display, may be with a projection or holographic or an external interface, for displaying the GUI. Similarly, the player and/or controls and/or editors, their respective features and GUI can be different in different embodiments. For instance, more options can be provided in the media editor, or the GUI can be implemented to highlight a certain feature based on the type of embodiment. Similarly, the audio and visual user data type embodiments can have different types of components or control sets. A few components can be more in number or less than demonstrated above, and can further employ different sub-categorical modules to facilitate with different features and functions. The above embodiments were just to demonstrate how a device 100 can facilitate in capturing, storing, accessing, editing, deleting and transferring the user data, where the user data was text-based user data and any suitable combination of text-based user data, audio-based user data and video based user data.

In some embodiments, device 100 may be a single unit for visual based user data or audio based user data for facilitating with different functions and features, or may have different sets of features, like for example opening text based data of some different file formats transferred from an external device, such as through a document viewer, or may comprise an optical character recognition feature for scanning and converting documents etc. Similarly, updates for the computer programs can also be given by any suitable means, such as for example by using a memory device or by connecting with computing devices, where the computing device may, for example, comprise of a software suite for updating and/or transferring user data, or by any other suitable method, for example OTA, based on the type of embodiment and/or usage. The above examples were just to demonstrate the possibilities of the embodiments, and not to restrict them to only one form of embodiment. Thus, the breadth and scope of the present invention should not be limited by any of the above described examples.