Lifelogging is the process of digitally recording one's life experiences. A lifelogging system usually comprises a lifelogging device that automatically and continuously captures the user's activities in the form of text and/or sensor data, such as image, audio, or video recordings, which are stored and organized for future use. People may want to log their activities for their own enjoyment, for example to keep a diary or to retrieve and share personal experiences. Lifelogging can also serve medical purposes, for example as an aid for people suffering from memory impairment.
Samsung patent application US20150169659 illustrates a system for generating a user lifelog by recognizing user activities from data acquired by a sensor, analyzing the recognized activities, and generating activity patterns. Data included in the lifelog can be processed using a preset daily task summary template to provide various services based on the lifelog. For example, an activity plan table can be automatically generated from the lifelog for a predetermined period of time. The quantity of activities included in the lifelog can be calculated, and an exercise plan table for a predetermined period can be automatically generated from that quantity. The frequencies of the activities included in the lifelog can be calculated, and recommended activities for a predetermined period can be automatically generated from the frequently performed activities. In addition, the activities included in the lifelog can be stored with their associated time and location, so that a record of past activities related to a specific location can be generated.
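The frequency-based recommendation and location lookup described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the lifelog entry format, the sample data, and the function names are all assumptions.

```python
from collections import Counter
from datetime import datetime

# Hypothetical lifelog entries: (activity, timestamp, location) tuples.
lifelog = [
    ("walking", datetime(2015, 6, 1, 8, 0), "park"),
    ("walking", datetime(2015, 6, 2, 8, 5), "park"),
    ("cycling", datetime(2015, 6, 3, 18, 0), "trail"),
    ("walking", datetime(2015, 6, 4, 8, 10), "park"),
]

def recommend_activities(entries, top_n=2):
    """Count how often each activity appears over the logged period
    and return the most frequently performed activities."""
    freq = Counter(activity for activity, _, _ in entries)
    return [activity for activity, _ in freq.most_common(top_n)]

def activities_at(entries, location):
    """Record of past activities associated with a specific location."""
    return [(activity, when) for activity, when, loc in entries
            if loc == location]

print(recommend_activities(lifelog))  # most frequent activity first
```

In the same spirit, an exercise plan table could be derived by summing the logged quantities of each activity over the chosen period.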
Samsung patent application US20140192229 illustrates a system for determining the emotion of a user who consumes content and attaching that emotion to the content. A smartphone can provide many services using the emotional information embedded in content such as a photo (the smartphone can extract the user's emotion from an image of the user obtained via a camera). For example, in a purchasable goods list, the smartphone can display an emotion icon on goods information to which the user's emotional information has been added. As another example, when displaying detailed information about goods from which the user's emotional information has been extracted, the smartphone can display that emotional information.
Furthermore, a smartphone can estimate the user's emotion by measuring the degree of stimulation of the user's sympathetic and parasympathetic nerves. In this case, the smartphone includes a sensor that measures the electrical conductance of the user's skin. For example, the smartphone can measure the user's skin electricity with this sensor while playing music. The smartphone's emotion extract program can then estimate the user's emotional information from the measured skin electricity: if the skin electricity value exceeds a reference emotional value, the program recognizes that the user's emotion for the music content has been extracted, and the device can attach the relevant emotional information at the point in the music where the emotion was detected. A smartphone can also estimate the user's emotion from movement patterns measured by motion sensors such as an acceleration sensor or a gravity sensor.
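The threshold comparison at the heart of this scheme can be sketched as a few lines of Python. The reference value, the units, and the sample readings below are illustrative assumptions; the patent does not specify them.

```python
# Illustrative reference value for the skin electricity threshold
# (microsiemens); an assumption, not a value from the patent.
REFERENCE_EMOTIONAL_VALUE = 4.0

def extract_emotion_events(skin_samples):
    """Return (sample_index, value) points where the measured skin
    electricity exceeds the reference emotional value, i.e. the moments
    at which the emotion extract program would tag the playing music
    content with emotional information."""
    return [(i, v) for i, v in enumerate(skin_samples)
            if v > REFERENCE_EMOTIONAL_VALUE]

# Hypothetical per-second readings captured while music plays.
samples = [2.1, 2.3, 4.6, 5.0, 3.2]
events = extract_emotion_events(samples)  # spikes at indices 2 and 3
```

A real system would map each sample index back to a playback timestamp so the emotional tag lands at the right point in the track.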
Future smartphones will provide 3D holographic projections. Samsung patent application US20150220058 illustrates a hologram display that can present a holographic image without a separate light source (laser diodes, light emitting diodes, or organic light emitting diodes) for illuminating the hologram, and can therefore be used in mobile devices such as smartphones.
Future smartphones will record, replay, and transfer digitized information for all five senses: sight, hearing, touch, smell, and taste. US20150220199 illustrates a device for recording and reproducing the five senses. The device obtains and stores sight and sound information as well as touch, smell, and taste information, which permits the touch, smell, and taste experience to be recreated alongside the sight and sound information during playback. Recorded touch signals can include feel, temperature, shape, texture, hardness, and humidity; together with recorded smell, they can form a new representation of an object, event, or environment. An electronic nose can be integrated to detect odor, and such data can be stored and registered. A data representation that includes video images, sound, and touch can also include odor and specific touch-related details such as shape, texture, hardness, temperature, and moisture (humidity), permitting the object to be recreated using playback devices that take into account three or all four of the image, sound, smell, and touch modalities.
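A recording of this kind could be organized as a container that bundles the per-modality data. The sketch below is a hypothetical data structure whose field names simply follow the modalities listed in US20150220199; it is not an API from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TouchData:
    """Touch-related details named in the patent."""
    temperature_c: float
    hardness: float      # normalized 0..1 (assumption)
    humidity: float      # relative humidity 0..1 (assumption)
    texture: str
    shape: str

@dataclass
class SenseFrame:
    """One recorded frame bundling the captured modalities."""
    video: bytes
    audio: bytes
    touch: Optional[TouchData] = None
    odor: List[float] = field(default_factory=list)   # e-nose channel readings
    taste: List[float] = field(default_factory=list)  # taste sensor readings

def playback_modes(frame: SenseFrame) -> List[str]:
    """Which modalities a playback device would need to recreate
    for this frame."""
    modes = ["image", "sound"]
    if frame.touch is not None:
        modes.append("touch")
    if frame.odor:
        modes.append("smell")
    if frame.taste:
        modes.append("taste")
    return modes
```

A playback device would inspect the frame and drive only the output hardware it has for the recorded modalities.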
US20140340206 illustrates sensory messaging systems that enable users to transmit sensory messages. The system conveys a sensory message by activating a sensory message-conveying component of the sensory messaging device. These conveying components can include vibrational, temperature-changing, aroma-emitting, light-emitting, sound-emitting, and shock-emitting components.
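Dispatching a received sensory message to the available conveying components might look like the sketch below. The component names match the list above, but the message format and the dispatch logic are assumptions made for illustration.

```python
# Hypothetical registry mapping each conveying component to an
# activation action; the string results stand in for hardware commands.
COMPONENTS = {
    "vibration":   lambda level: f"vibrate at {level}",
    "temperature": lambda level: f"heat/cool to {level}",
    "aroma":       lambda level: f"emit aroma {level}",
    "light":       lambda level: f"flash light {level}",
    "sound":       lambda level: f"play tone {level}",
    "shock":       lambda level: f"pulse shock {level}",
}

def convey(sensory_message):
    """Activate each conveying component requested by the message.
    Components the receiving device lacks are silently skipped."""
    actions = []
    for component, level in sensory_message.items():
        handler = COMPONENTS.get(component)
        if handler is not None:
            actions.append(handler(level))
    return actions

# Example message combining two sensory channels.
msg = {"vibration": "high", "aroma": "lavender"}
```

Skipping unknown components lets the same message be sent to devices with different subsets of the conveying hardware.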
©2015 TechIPm, LLC All Rights Reserved http://www.techipm.com/