Ever walk into a room and forget why on Earth you were there? Were you about to get a cup of coffee, or grab your car keys? Wonderful! It's frustrating at my level of distraction; now magnify that to the Nth degree, to Alzheimer's. Apply a rules and induction engine, and poof! One step further away from a managed care facility.
Teaching the AI induction and rules engine may require the help of your 10-year-old grandson. It's relatively easy, but you might need him to sleep over for a day or two.
It's all about variations on the same theme: tag a location (a room in an apartment) and an action, such as getting a cup of coffee from the kitchen. The repetitive nature of activities carrying a location tag lets the engine draw conclusions from historical behavior. The more variations of action tags and coinciding location tags it sees, the 'smarter' it becomes about your habitual activities. In addition, the calculations form a bell curve, a way to prioritize the most probable location/action tags for the suggested behavior. The 'outliers' on the bell curve have the lowest probability of occurrence.
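A minimal sketch of the idea, with the 'bell curve' approximated as a simple frequency distribution over observed (location, action) pairs. The class and method names here are hypothetical, not from any existing product.

```python
from collections import Counter

class HabitModel:
    """Toy induction engine: learns (location, action) pairs from
    repeated observations and ranks actions by historical frequency."""

    def __init__(self):
        self.counts = Counter()           # (location, action) -> occurrences
        self.location_totals = Counter()  # location -> total observations

    def observe(self, location, action):
        """Record one tagged event, e.g. ('kitchen', 'coffee')."""
        self.counts[(location, action)] += 1
        self.location_totals[location] += 1

    def suggest(self, location, top_n=3):
        """Return the most probable actions for a location, best first.
        Low-probability 'outliers' naturally fall to the bottom."""
        total = self.location_totals[location]
        if total == 0:
            return []
        ranked = sorted(
            ((action, n / total)
             for (loc, action), n in self.counts.items() if loc == location),
            key=lambda pair: pair[1], reverse=True)
        return ranked[:top_n]

model = HabitModel()
for _ in range(8):
    model.observe("kitchen", "coffee")
model.observe("kitchen", "car keys")
model.observe("kitchen", "water")
print(model.suggest("kitchen"))  # 'coffee' ranks first, at probability 0.8
```

The suggestion list is exactly the prioritized tag set described above: habitual pairs bubble up, one-off events sink.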
Beyond this 'black box', a small, lightweight computer (a smartphone) already integrates Bluetooth, NFC, and WiFi antennas; add a mobile application and you're set. Pair a small, high-quality Bluetooth microphone to interact with the app. There's also potential for exploring beyond the home.
Kidding, you don't need that grandson to help. Speak into the mic, say "Train," go into the room, and state your activity: "coffee." The app will correlate your location and action. Everyone loves to be included in the Internet of Things, so app features like alerts on deviation from the location 'map' are possible.
In earnest, I'm fairly certain this type of solution already exists. Barriers to adoption could include the computer/smartphone generational gap. Otherwise, someone is already producing the solution, and I just wasted a bus ride home.
Additionally, this software may be integrated with Apple's Siri, Google Now, Yahoo Index, or Microsoft Cortana, as an extension of the personal assistant.
OUR STUDENTS | The University of Delaware’s new PhD program in Financial Services Analytics (FSAN) has gotten off to a strong start.
“I love how UD was able to merge finance, data mining, statistics and other areas to create the FSAN program,” said Leonardo De La Rosa Angarita, a current FSAN student. “This is also reflected in the diversity of the students. We come from different fields, and it is wonderful how we are able to complement each other in so many different ways.”

The unique program, a collaborative effort between JPMorgan Chase & Co., the Alfred Lerner College of Business and Economics and the College of Engineering, teaches students to become experts at researching and analyzing large swaths of electronic information, known as “big data” in the business world.
Long Chen, another FSAN PhD student, said, “After I joined the program, I expected to learn about a broad range of disciplines to fill my toolkit, and that is exactly what we are doing—taking courses from the areas of finance, statistics and computer science. Although the program just started, I can tell we are heading in the right direction.”
Bintong Chen, Director of the FSAN program, said that students are trained as researchers and professionals who play key roles in interdisciplinary teams, applying their knowledge and skills to convert vast amounts of data into meaningful information for businesses and consumers.
Bintong Chen added that the first semester put students’ skills to the test, “due to the intensity and breadth of the core classes designed for the program, ranging from very technical subjects, such as machine learning and data mining, to very business-oriented topics about financial institutions.”
Students are interacting with JPMorgan’s Corporate and Investment Bank during the spring semester to identify topics for their research projects and potential summer internships.
“I always wondered what would happen if engineers and economists would speak the same language, if professors would be more open to the world outside the walls of their offices and if industry would get more interested in what we study in our classrooms,” said Eriselda Danaj, another student in the program. “It is challenging and I love it.”
Microsoft's Skype began rolling out an 'upgrade' to replace Lync yesterday. Are there any additional revenue opportunities for Skype beyond the business licensing fee for Office 365?
Look no further than the artifact of a chat: the conversation. This applies to any chat solution. Setting aside the 'privacy' issues, the Skype app may feed directly into a blog using a blog plugin.
This plugin would allow a user to connect a Skype account; user-defined settings would store the Skype credentials in the blog's 'Settings' menu.
The plugin will install a new type of blog 'object' called the Chat Conversation, or just Conversation. For each Skype chat log imported, a correlated Conversation [post] is created. Once a Conversation is posted, the blog admin may open the post and update it with any tags they see fit. An imported Conversation is, by default, set to a status of Pending. An accompanying widget will be installed as well, giving the blog a 'Skype Conversation' sidebar widget. The widget produces a Conversations tag cloud; if a tag is selected, the UI lists any Conversations that contain the selected tag.
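To make the plugin's data flow concrete, here is a sketch of the import and widget logic. The post fields (`type`, `status`, etc.) are illustrative stand-ins, not a real blogging platform's API.

```python
from collections import Counter

def import_conversation(chat_log, tags=None):
    """Turn an exported Skype chat log (plain text) into a blog
    'Conversation' post. Imports default to Pending, so the admin
    must review and tag them before they go public."""
    lines = [ln for ln in chat_log.splitlines() if ln.strip()]
    title = lines[0][:60] if lines else "Untitled conversation"
    return {
        "type": "conversation",
        "title": title,
        "body": chat_log,
        "tags": list(tags or []),
        "status": "pending",
    }

def tag_cloud(conversations):
    """Sidebar widget helper: tag -> number of Conversations carrying it."""
    cloud = Counter()
    for post in conversations:
        cloud.update(set(post["tags"]))
    return cloud

def by_tag(conversations, tag):
    """UI behavior when a cloud tag is clicked: list matching Conversations."""
    return [p for p in conversations if tag in p["tags"]]

post = import_conversation("[10:02] Al: hello", tags=["skype", "roadmap"])
cloud = tag_cloud([post])
```

Keeping Conversations as their own object type, rather than ordinary posts, is what lets the widget build its tag cloud from conversation tags alone.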
There may need to be a manual disclosure on the Conversation [post]. For each post, the admin would check a box stating that, as required by law, all participants of the conversation were notified it was recorded and stored with public access. Conversely, Conversations may be set to password protected upon upload.
Alternatively, export the text of the conversation from Skype, import it into an ordinary blog post, tag it, and publish it as Public or Password Protected. The drawback is that conversations are then bucketed with all the other posts, and conversation tags are not separated from other post tags. A Conversation may contain not only text but also audio, video, desktop sharing, etc. The limits may be the Skype client and what it can export. This also implies the Skype conversation and all of its components (audio, video, text) may be saved either on the desktop or in Skype's 'Cloud Data Services'.
In the last 20 years, I've watched technology trends and achievements rise and fall from the mainstream. Tech has augmented our lives and enhanced our human capabilities. Our evolution will continue to be molded by technology, shaping humanity for years to come.
Everything you might find on your computer, from emails to video, is a digital asset. Content from providers, team collaboration, push and/or pull asset distribution, and archiving content are the workflows of Digital Asset Management (DAM).
DAM solutions are rapidly going mainstream as small to medium-sized content providers look to take control of their content from ingestion to distribution. Shared digital assets will continue to grow rapidly. Pressure from stockholders to maximize the use of digital assets to grow revenue will fuel initiatives to globally share and maintain digital asset taxonomies. For example, object recognition applied to image, sound, and video assets will dynamically add tags to assets in an effort to index ever-growing content. If standard taxonomies are not globally adopted and continually applied to assets, stored digital content will become, in essence, unusable.
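The taxonomy point above can be sketched in a few lines: auto-tagging is only useful if the recognizer's output is filtered against the shared taxonomy, otherwise the index fragments. The `recognize` callable is a hypothetical stand-in for any object-recognition service.

```python
def auto_tag(asset, recognize, taxonomy):
    """Apply tags proposed by a recognizer, but keep only terms that
    exist in the shared taxonomy, so the asset index stays consistent.
    Returns a new asset dict; the original is left untouched."""
    candidates = recognize(asset["content"])
    accepted = [t for t in candidates if t in taxonomy]
    tagged = dict(asset)
    tagged["tags"] = sorted(set(asset.get("tags", [])) | set(accepted))
    return tagged

# Hypothetical recognizer and taxonomy for illustration.
taxonomy = {"beach", "sunset", "dog"}
recognize = lambda content: ["beach", "frisbee"]  # 'frisbee' not in taxonomy
photo = {"content": b"<image bytes>", "tags": ["vacation"]}
tagged = auto_tag(photo, recognize, taxonomy)
```

Dropping off-taxonomy terms (like 'frisbee' here) is the design choice that keeps globally shared tags searchable; a production system would instead queue unknown terms for taxonomy review.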
All devices across all business verticals will become 'Smart' devices with bidirectional data flow. Outbound 'Smart' device data is funneled into repositories for analysis to produce dashboards, reports, and rule suggestions.
Inbound 'Smart' device data can trigger actions on the device. Several devices may work in concert, defined by a 'grouping', e.g. Home: Environmental. Remote programming updates may be triggered by the analysis of data.
An AI rules engine runs on the backend. Rules are defined by induction, through data analysis and human-set parameters, and executed in sequence.
Device optimization updates: presets on devices may be tuned based on 'transaction' history, feedback from the user, and other 'Smart' devices.
Grouped 'Smart' devices, e.g. health monitors, have their data uploaded, analyzed, and correlated across the group; updated rules and notifications are then triggered.
Cloud 'services' enable scalability on demand, relatively lower [CapEx] overhead, offsite redundancy, etc. They let software solutions companies rapidly deploy to Dev, Test, and Prod environments. Gaming, storage, and virtual machines are just a few of the '…as a service' offerings. IoT analysis may reveal the need for yet another service.
Augmented Reality, A.R.
Integrates the user with the surrounding environment by overlaying images onto your view to represent anything, e.g. identifying surrounding people with their Twitter handle/user name above their heads. Interacts with the smartphone for inbound and outbound data flow. May allow app and OS programmers to let users interact with their 'traditional' software in new ways, e.g. in Microsoft Windows 8+ the current interaction with 'tiles' may shift from a two- to a three-dimensional manipulation and view of the tiles. Tiles (apps) pop up when, through object recognition, predefined characteristics match. Looking at a bank check sent to you in the mail? Your Bank of America tile/app may ask if you want to deposit the check right now.
Virtual Reality, V.R.
As drones, for example, collect more video footage, it may be used to let people experience the landscapes, beaches, cities, mountains, and other features of a potential destination, which may lead to tourism. In fact, travel agencies may purchase V.R. headsets and subscribe to a library of V.R. content. A repository platform would need to be created, and specs for the 'how to' of collecting V.R. video footage should be accessible. Hathaway real estate offers a V.R. tour of the house, from their office.
Autonomous Vehicles (Average Consumer or hobbyist)
Driving forces to integrate with society put pressure on individuals to join the collective social consciousness. As digital assets are published, people will lunge at the opportunity to tag every digital asset, both self-shared and community-shared. Tagging on social media platforms is already under way. Taxonomies are built, maintained, and shared across social media platforms. Inanimate objects are tagged systematically using object recognition. Shared and maintained global taxonomies not only store data on people and their associated metadata (e.g. shoe size, education level completed, HS photo, etc.) but also store metadata about groups of people, relationships, and their tagged object data.
The taxonomies are analyzed and correlated, providing better, more concise demographic profiles. These profiles can be used for:
Clinical trials data collection
Fast identification of potential outbreaks, used by the CDC
The creation and management of AI produced Hedge Funds
These three dreaded words, 'out of compliance', are ones you are guaranteed to see more and more often. As all aspects of our lives become metadata on a taxonomy tree, the analysis of that information will make correlations that drive consumers and members of society 'out of compliance'. For example, pointers to your shared videos of you skydiving will get added to your personal taxonomy tree. Your taxonomy tree will be available, and mandatory, to get life insurance from a tier 1 company. Upon daily inspection of your tree by an insurance AI engine, a hazardous event is flagged. Notifications from your life insurance company remind you that 'dangerous' activities are not covered under your policy. Two infractions may drive up your premiums.
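The daily inspection could be as simple as the sketch below: walk the personal taxonomy tree and flag any tag on an insurer-defined hazard list. The tree shape, hazard list, and the two-infraction threshold are all hypothetical.

```python
# Insurer-defined hazard list (hypothetical).
HAZARDOUS = {"skydiving", "base-jumping", "free-diving"}

def inspect_tree(taxonomy_tree):
    """Walk a personal taxonomy tree (nested dicts of category ->
    subtree or list of tags) and return every hazardous tag found."""
    flagged = []
    def walk(node):
        if isinstance(node, dict):
            for child in node.values():
                walk(child)
        else:  # leaf: a list of tags
            flagged.extend(t for t in node if t in HAZARDOUS)
    walk(taxonomy_tree)
    return flagged

tree = {
    "media": {"videos": ["skydiving", "birthday"]},
    "profile": ["shoe-size-10"],
}
infractions = inspect_tree(tree)
premium_hike = len(infractions) >= 2  # 'two infractions may drive up your premiums'
```

One skydiving video is a single flag here; a second flagged event would tip `premium_hike`, mirroring the scenario above.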