Tag Archives: Oracle

FinTech: End to End Framework for Client, Intermediary, and Institutional Services

Is it all about being the most convenient payment processing partner, with an affinity to the payment processing brand?  It’s a good place to start; consider the Amazon Payments partner program.

FinTech noun : an economic industry composed of companies that use technology to make financial systems more efficient

Throughout my career, I’ve worked with several financial services teams to engineer, test, and deploy solutions.  Here is a brief list of the FinTech solutions I helped construct, test, and deploy:

  1. 3K global investment bankers – proprietary CRM platform, including business analytics via a Business Objects Universe.
  2. Equity Research platform, crafted with domain expertise.
    • Custom UI for research analysts that enabled them to author research and push it into the workflow.
    • Based on a set of rules, a ‘locked down’ part of the report would build disclosures, e.g. the analyst holds 10% of the company.
    • A custom Documentum workflow would route research to the distribution channels, or direct it to legal review.
  3. (Multiple financial organizations) Data Warehouse middleware solutions to help organizations manage and monitor usage of their DW.
  4. Global derivatives firm: migration of a mainframe system to a C# client/server platform.
  5. Investment Banking and Equity Capital Markets (ECMG): built a trading platform so teams could collaborate on deals/trades.
  6. Global asset management firm: onboarding and fund management solutions, with custom UI and workflows in SharePoint.

*****

A “Transaction Management Solution” targets a mixture of FinTech services, primarily “Payments” Processing.

Target State Capabilities of a Transaction Management Solution:

  1. Fraud Detection:  The ability to identify and prevent fraud exists at many levels of the transaction, from facilitators of EFT to credit monitoring and scoring agencies.  Every touch point of a transaction has its own perspective on possible fraud, and must be evaluated to the extent it can be.
    • Business experts (SMEs) and technologists continue to expand the practical applications of Artificial Intelligence (AI) every day, and extensive AI fraud detection applications already exist, incorporating human-populated rules engines and machine learning (independent rule creation); a minimal rules-engine sketch follows this list.
  2. Consumer “Financial Insurance” Products
    • Observing a business transaction end to end may provide visibility into areas of transaction risk.  Process and/or technology may be adopted or augmented to minimize the risk.
      • E.g. the eBay auction process carries risk around the exchange of currency and merchandise.  A “delayed payment”, holding funds until the merchandise has changed hands, minimizes that risk, as implemented through PayPal.
    • These products move through a lifecycle of Discovery, Development, and Delivery phases, converting concept to product.
  3. Transaction Data Usage for Analytics
    • The client initiating the transaction, the intermediary parties, and the destination of the funds may all tell ‘a story’ about the transaction.
    • Every party within a transaction, beginning to end, may benefit from applying analytics to the transaction data.
      • e.g. Quicken – a personal finance management tool; it collects, parses, and augments transaction data to provide client analytics in the form of charts/graphs and reports.
    • A clear, consistent, and comprehensive data set should be available at every point in the transaction lifecycle, regardless of platform.
      • e.g. funds transferred between financial institutions may have descriptions that are not user friendly or actionable, e.g. a cryptic name and no contact details.
      • Normalizing the data may occur at an abstracted layer.
    • Abstracted and aggregated data may be used for analytics.
      • e.g. average car price given specs XYZ;
      • e.g. 2. avg. credit score in a particular zip code.
    • Continued growth opportunities and challenges
      • e.g. data privacy v. allowable aggregated data
  4. Affinity Brand Opportunities within the Transaction Management Solution
    • eWallet affinity brand promotions:
      • e.g. rules based on the transaction’s items, such as no shipping charges
      • e.g.2. “Cash Back” Rewards, and/or Market Points
      • e.g.3. Optional “Fundraiser” options at the time of purchase.
  5. Credit Umbrella: Monitoring Use Case
    • Transparency into newly activated accounts enables the Transaction Management Solution (TMS) to trigger a rule to email the card holder, if eligible, to add the card to an eWallet.
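
To make the rules-engine half of the fraud detection item concrete, here is a minimal sketch of a human-populated screening pass. The transaction fields, thresholds, and rule wording are all illustrative assumptions, not a description of any particular vendor’s engine.

```python
# Minimal sketch of a human-populated fraud-screening rules engine.
# All field names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country: str
    home_country: str
    txns_last_hour: int

# Each rule returns a reason string when it fires, otherwise None.
RULES = [
    lambda t: "high amount" if t.amount > 10_000 else None,
    lambda t: "foreign country" if t.country != t.home_country else None,
    lambda t: "velocity spike" if t.txns_last_hour > 5 else None,
]

def screen(txn: Transaction) -> list[str]:
    """Run every rule and collect the reasons that fired."""
    return [reason for rule in RULES if (reason := rule(txn))]

if __name__ == "__main__":
    txn = Transaction(amount=12_500, country="FR", home_country="US", txns_last_hour=2)
    flags = screen(txn)
    print("review" if flags else "clear", flags)
```

A machine-learning pass would sit alongside this: the induced model scores the transaction, and the human-populated rules above act as the explainable backstop.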

Is Intuit an acquisition target because of Quicken’s capability to provide users with consistent reporting of transactions across all sources?  I just found this note on Wikipedia while writing this post:

Quicken is a personal finance management tool developed by Intuit, Inc. On March 3, 2016, Intuit announced plans to sell Quicken to H.I.G. Capital. Terms of the sale were not disclosed.[1]

For quite some time, companies have attempted to tread in this space with mixed results, either through acquisition or by building out their existing platforms.  There seem to be significant opportunities within the services, software, and infrastructure areas.  It will be interesting to see how it all plays out.

Inhibitors to enclosing a transaction within an end to end Transaction Management Solution (TMS):

  • Higher level of risk (e.g. business, regulatory) when expanding service offerings
  • Stretching too thin, beyond the core vision, and losing sight of it
  • Transforming a tech company into a hybrid financial services firm
  • Automation and streamlining of processes may derive efficiencies that lead to a reduction in staff / workforce
  • Multiple platforms performing these functions provide redundant capabilities, reduced risk, and more consumer choice

 Those inhibitors haven’t stopped these firms:

Payments Ecosystem

 

Human Evolution: Technology Continues to Transform Societies for Generations

In the last 20 years, I’ve observed technology trends and tech achievements rise and fall from the mainstream.  Tech has augmented our lives and enhanced our human capabilities.  Our evolution will continue to be molded by technology, which will shape humanity for years to come.

Digital Asset Management (DAM)

Everything you might find on your computer, from emails to video, is a digital asset.  Ingesting content from providers, team collaboration, push and/or pull asset distribution, and archiving content are the workflows of DAM.

DAM solutions are rapidly going mainstream as small to medium sized content providers look to take control of their content from ingestion to distribution.  Shared digital assets will continue to grow rapidly.  Pressure from stockholders to maximize the use of digital assets to grow revenue will fuel initiatives to globally share and maintain digital asset taxonomies.  For example, object recognition applied to image, sound, and video assets will dynamically add tags to assets in an effort to index ever-growing content.  If standard taxonomies are not globally adopted and continually applied to assets, stored digital content will become, in essence, unusable.
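
As a rough illustration of the auto-tagging idea above, here is a minimal sketch in which labels returned by an object-recognition step are mapped onto a controlled taxonomy before being attached to an asset. The taxonomy terms, labels, and recognizer output are stand-ins made up for the example.

```python
# Sketch: map raw object-recognition labels onto a controlled DAM taxonomy
# before attaching them to an asset. Taxonomy and labels are illustrative.
TAXONOMY = {
    "beach": "Places/Outdoor/Beach",
    "dog": "Subjects/Animals/Dog",
    "sunset": "Scenes/Lighting/Sunset",
}

def tag_asset(asset_id: str, recognized_labels: list[str]) -> dict:
    """Keep only labels that resolve to a taxonomy term; drop the rest."""
    tags = sorted({TAXONOMY[l] for l in recognized_labels if l in TAXONOMY})
    return {"asset_id": asset_id, "tags": tags}

print(tag_asset("IMG_0001.jpg", ["dog", "frisbee", "beach"]))
```

Labels that do not resolve to the shared taxonomy are dropped rather than invented, which is exactly why a globally adopted taxonomy matters for keeping stored content usable.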

The Internet of Things (IoT)

All devices across all business verticals will become ‘Smart’ devices with bidirectional data flow.  Outbound ‘Smart’ device data flow is funneled into repositories for analysis to produce dashboards, reporting, and rules suggestions.

Inbound ‘Smart’ device data can trigger actions on the device.  Several devices may work in concert, defined by ‘grouping’, e.g. Home: Environmental.  Remote programming updates may be triggered by the analysis of data.

  • An AI rules engine runs on the ‘backend’.  Rules are defined by induction (through data analysis) and by human-set parameters, and are executed in sequence (see the sketch after this list).
  • Device optimization updates: presets on devices may be tuned based on ‘transaction’ history, feedback from the user, and other ‘Smart’ devices.
  • Grouped ‘Smart’ devices, e.g. health monitors: data is uploaded, analyzed, and correlated across the group, and updated rules and notifications are triggered.
  • Manual user commands, ad hoc or scheduled
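
Here is a minimal sketch of the backend rules pass described in the list above, run over a ‘Home: Environmental’ style group. The device names, thresholds, and command format are illustrative assumptions.

```python
# Sketch: backend rules pass over readings from a grouped set of 'Smart' devices.
# Device names, thresholds, and the command format are illustrative assumptions.
readings = {
    "home.thermostat": {"temp_f": 79},
    "home.humidity": {"rh_pct": 62},
}

# Rules executed in sequence: human-set parameters plus induced thresholds.
rules = [
    ("home.thermostat", lambda r: r["temp_f"] > 76, {"action": "cool", "set_f": 74}),
    ("home.humidity",   lambda r: r["rh_pct"] > 60, {"action": "dehumidify"}),
]

def evaluate(current: dict) -> list[tuple[str, dict]]:
    """Return the inbound commands triggered by the current readings."""
    commands = []
    for device, condition, command in rules:
        if device in current and condition(current[device]):
            commands.append((device, command))
    return commands

for device, command in evaluate(readings):
    print(f"send to {device}: {command}")
```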

… as a Service

Cloud ‘services’ enable scalability on demand, relatively lower cost [CapEx] overhead, offsite redundancy, etc.  They allow software solutions companies to rapidly deploy to Dev, Test, and Prod environments.  Gaming, storage, and virtual machines are just a few of the ‘…as a service’ offerings.  IoT analysis may reveal a need for yet another service.

Human Interface

  • Augmented Reality A.R.

A.R. integrates the user with the surrounding environment by overlaying images onto your view to represent anything, e.g. identifying surrounding people with their Twitter handle/user name above their heads.  It interacts with the smartphone for inbound and outbound data flow.  It may allow app and OS programmers to let users interact with their ‘traditional’ software in new ways; e.g. in Microsoft Windows 8+, the current interaction with ’tiles’ may shift from a two to a three dimensional manipulation and view of the tiles.  Tiles (apps) pop up when predefined characteristics match through object recognition, e.g. looking at a bank check that arrived in the mail?  Your Bank of America tile/app may ask if you want to deposit the check right now.

  • Virtual Reality, V.R.

As drones, for example, collect more video footage, that footage may be used to let people experience the landscapes, beaches, cities, mountains, and other features of a potential destination, which may drive tourism.  In fact, travel agencies may purchase V.R. headsets and subscribe to a library of V.R. content.  A repository platform would need to be created, and specs for the ‘how to’ of collecting V.R. video footage should be accessible.  Hathaway real estate offers a V.R. tour of the house, from their office.

Autonomous Vehicles (Average Consumer or Hobbyist)

  • Cars 
  • Drones
  • Satellites 

Social Media Evolution

Driving forces to integrate with society put pressure on individuals to join the collective social consciousness.  As digital assets are published, people will lunge at the opportunity to self-tag every digital asset, both self- and community-shared.  Tagging on social media platforms is already common.  Taxonomies are built, maintained, and shared across social media platforms, and inanimate objects are tagged systematically using object recognition.  Shared, maintained global taxonomies not only store data on people and their associated metadata (e.g. shoe size, education level completed, HS photo, etc.) but also store metadata about groups of people, relationships, and their tagged object data.

The taxonomies are analyzed and correlated, providing better, more concise demographic profiles.  These profiles can be used for:

  • Clinical trials data collection
  • Fast identification of potential outbreaks, used by the CDC
  • The creation and management of AI produced Hedge Funds
  • Solicitation of goods and services

Out of Compliance

You are guaranteed to see these three dreaded words more and more often.  As all aspects of our lives become metadata on a taxonomy tree, the analysis of that information will make correlations which drive consumers and members of society ‘out of compliance’.  For example, pointers to your shared videos of you skydiving will get added to your personal taxonomy tree.  Your taxonomy tree will be available, and mandatory, to get life insurance from a tier 1 company.  Upon daily inspection of your tree by an insurance AI engine, a hazardous event is flagged, and your life insurance company sends notifications reminding you that ‘dangerous’ activities are not covered on your policy.  Two infractions may drive up your premiums.
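
The hypothetical insurance check above boils down to a simple scan over a personal taxonomy tree. This minimal sketch mirrors the skydiving example and the two-infraction threshold; the tags and structure are otherwise assumptions.

```python
# Sketch of the hypothetical insurance check described above: scan a personal
# taxonomy tree for hazardous-activity tags and flag repeat infractions.
HAZARDOUS = {"skydiving", "base-jumping", "free-solo-climbing"}

personal_tree = {
    "videos": [
        {"title": "Jump over the valley", "tags": ["skydiving", "gopro"]},
        {"title": "Beach day", "tags": ["beach", "family"]},
        {"title": "Second jump", "tags": ["skydiving"]},
    ]
}

infractions = [
    v["title"] for v in personal_tree["videos"] if HAZARDOUS & set(v["tags"])
]

if len(infractions) >= 2:
    print("premium review triggered:", infractions)
elif infractions:
    print("reminder notice sent:", infractions)
```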

Companies Turn Toward “Data Sifters” & “Data Banks” to Commoditize on ‘Smart Object’ Data

Anyone who is anti “Big Brother”, this may not be the article for you; in fact, skip it. 🙂
In the not so distant future, “Data Sifter” companies consisting of Subject Matter Experts (SMEs) across all verticals may process your data feeds collected from ‘smart objects’.  Consumers will be encouraged to submit their smart data to ‘data sifters’, who will offer incentives such as a reduction in insurance premiums.
Everything from activity trackers and home automation to vehicular automation data may be captured and aggregated.  The data collected can then be sliced and diced to provide macro and micro views of the information.  At the abstract, or macro, level the information may allow for demographic, statistical correlations, which may contribute to corporate strategy.
At a granular view, the data will give “data sifters” the opportunity to sift through ‘smart’ object data to perform analysis and correlations that lead to actionable information.
Is it secure?  Do you care if a hacker steals your weight loss information?  In fact, you might feel more nervous if only the intended parties are allowed to collect the information.  Collected ‘Smart Object’ data enables SMEs to correlate the data into:
  • Canned, ‘intelligent’ reports targeted to specific subject matter, or across silos of data
  • ‘Universes’ (i.e.  Business Objects) of data that may be ‘mined’ by consumer approved, ‘trusted’ third party companies, e.g. your insurance companies.
  • Actionable information based on AI subject matter rules engines

Consumers may have the option of sharing their personal data with specific companies by proxy, through a ‘data bank’, down to the individual data point collected (a minimal consent-filtering sketch follows the list below).  The sharing of personal data or information:

  1. may lower [or raise] your insurance premiums
  2. may provide discounts on preventive health care products and services, e.g. vitamins to yoga classes
  3. may suggest targeted, affordable medicine, redirecting the doctor’s choice to an alternative; the MD would be contacted to validate the alternative.
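
Here is a minimal sketch of the ‘data bank by proxy’ idea above: a consumer’s per-company consent record, kept down to the individual data point, filters what a given company may pull. The consent scopes and data points are illustrative assumptions.

```python
# Sketch: a 'data bank' releasing personal data points by proxy, filtered by
# per-company consent. Consent scopes and data points are illustrative.
data_points = {
    "steps_per_day": 9200,
    "resting_heart_rate": 58,
    "home_thermostat_temp": 71,
    "vehicle_mileage": 48200,
}

consents = {
    "acme_health_insurance": {"steps_per_day", "resting_heart_rate"},
    "acme_auto_insurance": {"vehicle_mileage"},
}

def share_with(company: str) -> dict:
    """Return only the data points the consumer approved for this company."""
    allowed = consents.get(company, set())
    return {k: v for k, v in data_points.items() if k in allowed}

print(share_with("acme_health_insurance"))
```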

The ‘smart object’ data collected may be harnessed by thousands of affinity groups to provide very discrete products and services.  The power of this collected ‘smart data’ and correlated information stretches beyond any consumer relationship experienced today.

At some point, health insurance companies may require you to wear a tracker to increase or slash premiums.  Auto Insurance companies may offer discounts for access to car smart data to make sure suggested maintenance guidelines for service are met.

You may approve your “data bank” to give access to specific soliciting government agencies or private research firms looking to analyze data for their studies.  You may qualify based on the demographic, abstracted data points collected.  Incentives provided may be tax credits or paid studies.

‘Smart Object’ Adoption and Affordability

If ‘Smart Objects’, Internet of Things (IoT) enabled, are cost prohibitive, here are a few ways to increase their adoption:
  1. [US] tax coupons that enable the buyer to save money at the time of purchase.  For example, a 100 USD discount applied at the time of purchase of an Activity Tracker, with the stipulation that you may agree, at some point, to participate in a study.
  2. Government subsidies: offset the cost of ‘Smart Objects’ through annual tax deductions.  Today, tax incentives may allow you to purchase a ‘Smart Object’ if the cost is an itemized medical tax deduction, such as an Activity Tracker that monitors your heart rate, if your medical condition requires it.
  3. Auto, Life, Homeowners, and Health policy holders may qualify for additional insurance deductions.
  4. Affinity-branded ‘Smart Objects’: for example, the American Lung Association may sell a logo-branded Activity Tracker.  People may sponsor the owner of the tracker to raise funds for the cause.
The World Bank has a repository of data, World DataBank, which seems to store a great depth of information:
“World Bank Open Data: free and open access to data about development in countries around the globe.”
Here is the article that inspired me to write this post:
Smart Object Data Ecosystem

Business Intelligence, Analogies, and Articulation of Data on Mediums

As I was reading the New York Times article, As Boom Lures App Creators, Tough Part Is Making a Living, it struck me that the typical doom and gloom story about getting rich quick by creating tablet applications is true of any start-up company, be it a restaurant, clothing shop, or other venture.  You have an idea, Sally has an idea, and so does Fred, and the likelihood that everyone will be elated about every bar, restaurant, clothing store, or application is unrealistic.  Simple economics and opportunity cost: you cannot go to every restaurant in parallel every night, and one USD spent here trades off the opportunity to spend it somewhere else.

One area I suspect has massive opportunities in the coming weeks, months, and years is Business Intelligence, analogies, and the articulation of data on a tablet medium.  Yes, there are established players in the marketplace, but being established also makes you less nimble for change.  Being able to look at a client’s data warehouse, create mediums for analogies expressing where their customers have been spending their money and why, and help predict trends in a KISS fashion for any level of a business organization is key.  That is why the innate talents of user interface design and engineering (or, way back, ‘industrial design’) matter so much.  In short, part of the appetite for corporate spending will always come from “how do I make more money with the product I just bought?”, i.e. Return on Investment (ROI).  Business Intelligence is an area I have been studying for years, and as we all know, it is difficult to express or analogize thoughts, and specifically to dive into ‘data’ and turn it into information a CEO or business analyst can understand and turn into a new marketing campaign; hence, business intelligence.  Until we can all read minds and transfer information like for like, BI, and improving upon this space, will be an area to derive income.


WordPress Shortcode API to Cloud Storage to Sell Any Digital Intellectual Property.

So, I was browsing, going through bills, and thinking; hey, relating to my other article on Google Docs and their new API, where you could use them as a data warehouse, it occurred to me: why can’t we use a public API for the cloud storage systems like Amazon Web Services (AWS) S3 (or Box.com), create a WordPress plugin, add e-commerce, and you now have your own place to sell digital music or any digital intellectual property, or even host your own OLTP or OLAP database?

And my bro, Fat Panda, might have been thinking the same thing.  He’s one step behind, but he will catch on.  I will try to update for ‘the cheap seats’ in a bit.

For the cheap seats: even with static files stored up in the cloud, you can use a model similar to Google Docs <-> Google Fusion, where you add tabular data to storage, then read, overwrite, or update it using a home-made table locking mechanism, and essentially use the cloud as a data warehouse, or even a database.  Microsoft seems to have a lead on transactional and analytical storage with Microsoft Azure, relational in nature in the cloud, but it can be much simpler than that with plain cloud storage.  If row-level locking is not implemented, there is an issue with OLTP (On Line Transaction Processing), which is row-level and high volume; with OLAP (On Line Analytic Processing), analyzing the way your business does business to profit more from your consumer data, not so much.  There are easy ways to implement row-level locking for tabular data stored in cloud storage like AWS or Box.net, and the methods will remind you of old-school alternatives to the AutoNumber columns in MS Access or Identity columns in SQL Server.  At the end of the day, you could either sell digital intellectual property from a WordPress implementation, or run your entire business with a cloud database solution for OLTP or OLAP systems using flat file storage.  Why go through all this when Amazon’s AWS and Microsoft Azure have built, or will yearn to start building, these solutions in parallel?  Cost-effective solutions, and the entire database arena monopolized by Oracle, IBM, Microsoft, and MySQL just got extended to a whole lot of new database vendors.  It may take a while, but we already know the big gorilla in the room, Google, is the first to strike in this game as a non-traditional database vendor and cloud storage provider, with their updated Google Docs API and, optionally, their Fusion application.
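
To show what an ‘old school’ row-locking alternative might look like over flat files in cloud storage, here is a minimal sketch. The storage client is reduced to an in-memory stand-in (FakeStore) so the pattern is visible without tying it to any vendor’s API; a real implementation would need an atomic create-if-absent operation from the provider.

```python
# Sketch of 'old school' row-level locking over flat files in object storage.
# FakeStore stands in for whatever cloud storage client you use; the real
# implementation would need an atomic create-if-absent from the provider.
import time

class FakeStore:
    """In-memory stand-in for a key/value object store."""
    def __init__(self):
        self.objects = {}
    def create_if_absent(self, key: str, body: str) -> bool:
        if key in self.objects:
            return False
        self.objects[key] = body
        return True
    def delete(self, key: str):
        self.objects.pop(key, None)

def update_row(store: FakeStore, table: str, row_id: int, new_value: str,
               retries: int = 5) -> bool:
    """Take a per-row lock object, write the row, release the lock."""
    lock_key = f"{table}/locks/{row_id}"
    for _ in range(retries):
        if store.create_if_absent(lock_key, "locked"):
            try:
                store.objects[f"{table}/rows/{row_id}"] = new_value
                return True
            finally:
                store.delete(lock_key)
        time.sleep(0.1)  # another writer holds the row; back off and retry
    return False

store = FakeStore()
print(update_row(store, "orders", 42, "status=shipped"))
```

The per-row lock object is the same trick as a lock table next to an Identity/AutoNumber column: whoever creates the lock first wins, and everyone else backs off and retries.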

Tablet Developers Make Business Intelligence Tools using Google as a Data Warehouse: Competing with Oracle, IBM, and Microsoft SQL Server

And, he shoots and scores.  I called it, sort of.  Google came out today as a data warehouse vendor; at the least, they need a community of developers to connect the dots and help build an amazing Business Intelligence suite.

Google came out with a Google Docs API today, usable from languages ranging from Objective-C (iOS) and C# to Java, so you can use Google as your data warehouse for any size business.  All you need to do is write an ETL program which uploads and downloads tables between your local database and Google Docs, and create your own Business Intelligence user interface for the creation and viewing of charts and graphs.  It looks like they’ve changed strategies, or this was the plan all along.

Initially, I thought that Google Fusion was going to be the table-editing tool to manipulate data transferred from your transactional database using the Google Docs API.  Today they released the Google Docs API, and developers can create their own ETL drivers and a Business Intelligence user interface that can run on any platform, from an Android tablet to an iPad or Windows tablet.

A few days ago, I wrote an article suggesting they were going to use a tool called Google Fusion, which was in Beta at the time, to manipulate tabular data, and eventually extend it to create common BI components such as graphs, charts, editable tables, etc.

A few gotchas: Google Docs on the Apple iPad is at version 1.1.1, released 9/28/12, so we are talking very early days, and the Google Docs API was released today.  I would imagine, since you can also use C#, someone can make a Windows desktop application to manipulate the data tables and create and view graphs, so a Windows tablet can be used.  The API also has Java compatibility, so from any Unix box, or any platform (Java is write once, run anywhere), wherever your transactional database lives, a developer is able to write a driver to transfer the data to Google Docs dynamically and then use the Google Docs API for Business Intelligence.  You could even write an ETL driver which simply transfers data rapidly, like an ODBC or JDBC driver, and use any business intelligence tools you have on your desktop, or run a nightly ETL.  However, I can see developers creating business intelligence tools on Android, iPad, or Windows tablets to modify tables, create and view charts, etc., using custom BI tool sets, with Google Docs as their data warehouse.
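
As a rough sketch of the ETL driver described above: pull a table out of the local transactional database, serialize it, and hand it to whatever upload call your Docs/Drive client library exposes. The upload_csv function below is a hypothetical placeholder for that client call, not the actual Google API.

```python
# Sketch: nightly ETL that dumps a local table to CSV for upload to the
# document store acting as the warehouse. upload_csv is a hypothetical
# placeholder for whichever Docs/Drive client call you actually use.
import csv
import sqlite3

def extract_table(db_path: str, table: str, out_path: str) -> int:
    """Dump one table from the local transactional database to CSV."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(f"SELECT * FROM {table}")
        headers = [col[0] for col in cur.description]
        rows = cur.fetchall()
    finally:
        conn.close()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(headers)
        writer.writerows(rows)
    return len(rows)

def upload_csv(path: str, target_name: str) -> None:
    # Placeholder: swap in the real client library call here.
    print(f"would upload {path} as {target_name}")

if __name__ == "__main__":
    # Build a tiny demo database so the sketch runs end to end.
    conn = sqlite3.connect("sales.db")
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, total REAL)")
    conn.execute("INSERT INTO orders VALUES (1, 19.99)")
    conn.commit()
    conn.close()

    count = extract_table("sales.db", "orders", "orders.csv")
    upload_csv("orders.csv", "orders_nightly")
    print(f"{count} rows staged")
```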

Please reference an article I wrote a few days back, “Google is Going to be the Next Public and Private Data Warehouse“.

At that time (10/13/2012), Google Fusion was marked as Beta.  Google has since stripped off the word Beta, but it doesn’t matter; it’s even better with the Google API to Google Docs.  Google Fusion could be your starter user interface; however, if your Android, iOS (Apple iPad), and Windows developers really embrace this API, all of the big database companies like IBM, Oracle, and Microsoft may have their market share eroded to some extent, if not a great extent.

Update 10/19:

Hey Gs (guys and gals), I forgot to mention: you could perhaps also make your own video or music streaming applications using the basic get and receive file calls, as other companies such as AWS, Box, etc. already do.  It’s a simple get/send API, so I’m not sure if it’s applicable to ‘streaming’ at this stage; it may be just another storage location in the ‘cloud’, which would be quite boring.  Although, thinking of it now, aren’t all the put/send cloud solutions potential data warehouses using ETL and the APIs discussed above?  Also, it’s ironic that Google would be competing with itself, between file sharing, ‘streaming’ videos, and YouTube.

PostgreSQL / local database and SOA, mid tier for cloud solutions to improve performance

In an article I read from the NY Times, Salesforce.com may be making a play to banish Oracle as a supported platform.  However, an interesting system would be one where PostgreSQL, or an in-memory database, acts as a local cache for the transaction-based system, and clears the local records/cache after it uploads the ‘staged’ data from the local database to a cloud database where the data is ultimately stored.  The activities on the local database should be fast, and the cloud database offers: a) data that may be transformed to any cloud-based solution vendor(s), if necessary, provided an SOA is built on top of the local database which communicates with the cloud via APIs; b) a local data mart, if the data is not transferred in real time, i.e. use a nightly transformation and have access to “day of” BI on a limited set of local data; c) again, transaction performance and data segregation of the warehouse.  This architecture is already in use at many firms, but I wanted to call it out.  Another option is to use two cloud database solutions, one ‘local’ to your region and one globally dispersed, for performance and redundancy using an ETL, although I am not convinced this would be a great architecture.  The second cloud tier can be a transformation from the first for regulatory archiving, if required by law for finance or by DR (Disaster Recovery) policy.
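
Here is a minimal sketch of the local-cache-then-flush pattern above, with SQLite standing in for the local PostgreSQL/in-memory store and push_to_cloud as a hypothetical stand-in for the cloud vendor’s API: staged rows are uploaded and then cleared locally.

```python
# Sketch of the staging pattern above: write transactions to a fast local
# store, periodically push the staged rows to the cloud database, then clear
# them locally. SQLite stands in for the local PostgreSQL / in-memory cache,
# and push_to_cloud is a hypothetical stand-in for the vendor API call.
import sqlite3

local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE staged_txn (id INTEGER PRIMARY KEY, payload TEXT)")
local.execute("INSERT INTO staged_txn (payload) VALUES ('order:1001'), ('order:1002')")
local.commit()

def push_to_cloud(rows):
    # Placeholder for the SOA call that hands rows to the cloud database.
    print(f"uploaded {len(rows)} rows")
    return True

def flush_staged(conn):
    rows = conn.execute("SELECT id, payload FROM staged_txn").fetchall()
    if rows and push_to_cloud(rows):
        conn.executemany("DELETE FROM staged_txn WHERE id = ?", [(r[0],) for r in rows])
        conn.commit()

flush_staged(local)
print(local.execute("SELECT COUNT(*) FROM staged_txn").fetchone()[0])  # 0 after flush
```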

Google is Going to be the Next Public and Private Data Warehouse

In an article I wrote a while back, Google to venture into Cloud, provide Open Source APIs, assist small businesses to be Cloud Solutions Integrators, I was talking in the abstract, but I saw on the Google site, buried way down their menus (under ‘More’, then ‘Even More’, and at the bottom left of the page under Innovation), Fusion Tables (Beta).  Google is advanced and ready to compete with the database vendors, with a user-friendly UI, better than I thought.  They currently provide a way to upload data to Google Drive; the user then imports the data from Google Drive and, using table views and Business Intelligence tools, can manipulate and share the data.  The data allowed to be uploaded into tables seems limitless.  Although they state the product is still in Beta, and publicly show users uploading and linking to Google data rather than connecting to external data sources, such as your sales transaction database, there may be an API in the works for third parties to allow for integration through direct connections, such as ODBC or JDBC drivers, to integrate with transactional systems and stream data rather than just uploading it.  However, this may be their strategy: host all of the data and provide a migration utility.  At this stage they would like to house the data and own the cloud storage infrastructure, but the strategic mid-term goal may be to let you house your RDBMS transaction data locally, stream and/or upload it into their data warehouse, apply Business Intelligence to manipulate the data, and then publish it in multiple formats; e.g. they would display the data for public or private consumption, and I can also see you being able to publish charts with commentary into your Google Plus stream to specific ‘Circles’.  Brilliant.  Hats off to you guys.  If Google allows streaming of the data, or what we call data transformations, from your (e.g.) sales transaction system to the Google data warehouse, then they would be competing with IBM, Oracle, and Microsoft.

Update: 12/26/12
After all of that profound scoping and keen insight, a developer messaged me that Google’s BigQuery does the job better.  I am curious why it has not taken off in the marketplace.  Anti-trust?  Also, why then create an abstraction layer like Fusion and call out Google Docs explicitly?  Maybe that would help them transition into the market space with a different level of user, the consumer, or a different target user, such as the small business.

Google to venture into Cloud, provide Open Source APIs, assist small businesses to be Cloud Solutions integrators

I am quite familiar with the Oracle and HP approaches of ‘owning the cloud stack’, with the Amazon Web Services approach, which caters to all classes and makes cloud modules interchangeable, and with the arguments for both sides of the coin: the one-stop shop as a service for top-tier clients versus the open APIs, partnering, and interchangeability of plug-and-play cloud partner models.  I challenge you to look beyond that and see a vast network of fiber and high-speed computing available across the globe, from small and large companies, over cloud solutions from infrastructure to data warehousing.

The question we should all be asking on behalf of small to mid-size companies: who will be the solutions provider who pulls this all together?  The solutions integrator, just like we saw in the 80s and early 90s when Banyan Vines and Novell were sprouting and we witnessed Microsoft, out of the blue, take hold of the networking space.  Is Amazon Web Services the beginning of an open-sourced API model?  Will VMWare or Red Hat sprout up cloud farms?  Will whoever owns the fiber lines, as in the Google experiments in Kansas, venture into the cloud business model and provide open-sourced APIs to small to mid-sized businesses, so they can be the solutions integrators of tomorrow?

Big Data Creates Opportunities for Small to Midsize Retail Vendors through Collective Affinity, Consumer Alignment

In the Harvard Business Review there is an article, Will Big Data Kill All but the Biggest Retailers?  One idea to mitigate that risk is to create a collective of independent retailers under affinity programs, such as charities, and offer customers the option that every Nth part of their purchase applies to the charity, toward specific goals defined by the consumer.  Merchants, as part of this program, decide their own caps, or monetary participation levels.  Consumers belong to an affinity group, but it’s not limited to a particular credit card.  The key is that this transaction data is available to all participating merchants in the affinity; the transaction data spans all merchants within the affinity, not just the transactions executed with a single merchant.
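
The ‘every Nth part of the purchase’ mechanics above reduce to a simple allocation with a per-merchant cap. A minimal sketch, with illustrative rates and cap figures:

```python
# Sketch of the affinity allocation described above: a consumer-chosen share of
# each purchase goes to the charity, capped by what the merchant agreed to fund.
# Rates and caps are illustrative assumptions.
def charity_allocation(purchase_total: float, consumer_rate: float,
                       merchant_cap: float) -> float:
    """Consumer picks the rate (e.g. 1/50th of the purchase); merchant caps it."""
    return round(min(purchase_total * consumer_rate, merchant_cap), 2)

print(charity_allocation(purchase_total=40.00, consumer_rate=1/50, merchant_cap=1.00))   # 0.8
print(charity_allocation(purchase_total=200.00, consumer_rate=1/50, merchant_cap=1.00))  # capped at 1.0
```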

Using trusted, independent marketing data warehouses, independent retail vendors share ‘big data’, enabling them to compete and utilize the same pool of consumer [habitual] spending data.

Affinity marketing data companies can empower their independent vendors with the tools for Business Intelligence and pull from the collective of consumer data.  Trusted, independent marketing data warehouses sprout up to collect consumer data and enable their retail vendor clients to mine the data.

These trusted loyalty/affinity data warehouses would not be affiliated with financial institutions, as previously implemented with credit cards, but would be more analogous to supermarket-style loyalty programs; however, all independent retail vendors may participate, or the affinity program memberships may be capped to retail vendors from small to mid-size companies.

Note: data obfuscation would be applied so customer-identifying fields like social security number are not exposed, limiting any liability for fraud.
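
A minimal sketch of that obfuscation note: identifying fields are replaced with salted hashes before the record is shared into the affinity pool, so merchants can still join on a stable token without seeing the raw identifier. The field names and salt are illustrative.

```python
# Sketch: obfuscate identifying fields before a transaction record is shared
# into the affinity data pool. Field names and the salt are illustrative.
import hashlib

SALT = b"rotate-me-per-environment"
SENSITIVE_FIELDS = {"ssn", "card_number"}

def obfuscate(record: dict) -> dict:
    """Replace sensitive values with salted hashes; keep everything else."""
    cleaned = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            cleaned[key] = digest[:16]  # stable token, not the raw identifier
        else:
            cleaned[key] = value
    return cleaned

print(obfuscate({"ssn": "123-45-6789", "merchant": "ShoeBarn", "total": 59.95}))
```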
