Bot Tables: New RPA Automations Require New Data Stores
By Joe Labbé on March 16, 2020
Last week, we released a great case study chronicling an RPA automation we built for ECi Defense Group. Prior to the automation's implementation, ECi staffers manually downloaded government-generated solicitations for goods and services, reviewed them, and assigned them to specific clients for bids. The problems were threefold: the daily volume of solicitations was significant, the solicitations came from many sources (websites, email, etc.), and the solicitation documents did not clearly distinguish duplicate or updated solicitations. All of these factors contributed to heavy manual processing and high error rates.
While most folks look at this automation project and focus on the physical task of downloading the solicitations, I believe the most critical part of the solution is what Jessica Baker Boehme described as the “history database.”
“When a new solicitation is found, the download is recorded in a history database for future searches to check against. The history database has added much more traceability because we know exactly when and from where each item was pulled. This has been extremely helpful when investigating any missed opportunities…,” said Baker Boehme. While automating the downloading process was a big time-saver, the creation and maintenance of the database by the RPA bots is the real gem of the project.
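The case study doesn't publish the database's schema, but the check-before-download pattern Baker Boehme describes can be sketched roughly as follows. This is a generic SQLite illustration; the table, column names, and sample identifiers are all hypothetical, not KnowledgeLake's actual implementation.

```python
import sqlite3
from datetime import datetime, timezone

# Illustrative history table: records when and from where each solicitation
# was pulled, so later searches can skip duplicates while still treating a
# revised solicitation (new revision number) as new work.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE history (
    solicitation_id TEXT,
    revision        INTEGER,
    source          TEXT,   -- website, email inbox, etc.
    pulled_at       TEXT,
    PRIMARY KEY (solicitation_id, revision)
)""")

def already_downloaded(sol_id, revision):
    row = db.execute(
        "SELECT 1 FROM history WHERE solicitation_id = ? AND revision = ?",
        (sol_id, revision)).fetchone()
    return row is not None

def record_download(sol_id, revision, source):
    db.execute("INSERT INTO history VALUES (?, ?, ?, ?)",
               (sol_id, revision, source,
                datetime.now(timezone.utc).isoformat()))

# First sighting of a solicitation: download it and record the pull.
if not already_downloaded("SOL-12345", 1):
    record_download("SOL-12345", 1, "agency-website")
```

Because the pull's source and timestamp are stored alongside the key, the same table that prevents duplicate downloads also provides the traceability the quote mentions.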
I draw your attention to this part of the solution because it emphasizes an aspect of RPA projects that is often overlooked: the need to create new data stores to facilitate the automation. Many automations are all about moving data between systems and automating data entry. And that's great. However, many automations also require new data stores to house supplemental information that is needed to perform the automation but is not necessarily part of the actual transaction data set. In the KnowledgeLake platform, we call these purpose-built RPA data stores “Bot Tables.”
A “Bot Table” is a structured, platform-enabled data source that RPA bots can use to house the following kinds of information:
- The automation’s actual transactional data preserved for logging and reporting purposes.
- Metadata about the transaction.
- Validation data required to help facilitate the transaction.
- Transient data used during the execution of a transaction but not persisted long-term (think automation clipboard).
- Data derived during the execution of a transaction that should be persisted long-term for historical purposes.
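In relational terms, the five categories above might map onto a table shaped something like the following. This is a generic SQLite sketch of the idea; the actual Bot Table schema in the KnowledgeLake platform is not shown here, and every name below is illustrative.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# One way to model a Bot Table: the transactional payload plus metadata,
# validation reference data, and a flag distinguishing transient
# "clipboard" rows from rows persisted long-term for history.
db.execute("""CREATE TABLE bot_table (
    id         INTEGER PRIMARY KEY,
    txn_id     TEXT NOT NULL,      -- the automation's transaction
    payload    TEXT,               -- transactional data kept for logging/reporting
    metadata   TEXT,               -- metadata about the transaction (e.g. JSON)
    validation TEXT,               -- reference data used to validate the transaction
    transient  INTEGER DEFAULT 0,  -- 1 = clipboard-style scratch data
    created_at TEXT DEFAULT (datetime('now'))
)""")

db.execute("INSERT INTO bot_table (txn_id, payload, transient) "
           "VALUES ('T-1', 'order 123', 0)")
db.execute("INSERT INTO bot_table (txn_id, payload, transient) "
           "VALUES ('T-1', 'scratch value', 1)")

# Transient rows are swept once the transaction completes; the rest persist.
db.execute("DELETE FROM bot_table WHERE transient = 1")
```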
Considerations When Using Bot Tables
Use of Bot Tables should not be an afterthought and should be built into your RPA platform of choice. When using Bot Tables, you should consider the following:
- What information should be persisted and for how long?
- From a compliance perspective, are we allowed to persist this information? If so, how must it be secured?
- Should the platform periodically purge information as it ages?
- How is the Bot Table situated for disaster recovery?
- Besides the bots, who will have access to Bot Tables, and do we need a CRUD operations interface so human counterparts can perform routine maintenance?
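The retention question above is usually answered with a scheduled purge of rows older than a retention window. Here is a minimal sketch of that idea using SQLite; the table, column names, and 365-day window are assumptions for illustration, not a platform default.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE bot_history (
    id         INTEGER PRIMARY KEY,
    detail     TEXT,
    created_at TEXT
)""")

# Seed one row well outside the retention window and one recent row.
db.execute("INSERT INTO bot_history (detail, created_at) "
           "VALUES ('old run', datetime('now', '-2 years'))")
db.execute("INSERT INTO bot_history (detail, created_at) "
           "VALUES ('recent run', datetime('now', '-10 days'))")

RETENTION_DAYS = 365  # assumed policy; set per compliance requirements

# Purge rows older than the retention window; in practice this would be
# run on a daily or weekly schedule by the platform.
db.execute(
    "DELETE FROM bot_history WHERE created_at < datetime('now', ?)",
    (f"-{RETENTION_DAYS} days",))
```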
Here’s another great example of the use of Bot Tables. We have a client that takes orders for “free” promotional items on its website from hundreds of external sale reps. Since the items are free, sales reps often dole them out to prospects and customers with reckless abandon which costs the company hundreds of thousands of dollars per year. Unfortunately, the client’s e-commerce solution does not have a mechanism for tracking order quotas at the sales rep/item level. In this case, we created an RPA unattended bot that captures every order placed in real-time and verifies that the number of items being ordered by a given rep is within that rep’s order quota. In order to facilitate this process, we had to create a Bot Table that housed the rep/item quota matrix.
Now, as orders are placed, the bot captures the order, compares the line items to the rep/item quota matrix, and determines whether each line item should be approved. If approved, the bot decrements the existing quota by the amount ordered. If not approved (i.e., the order would exceed the quota), the line item is zeroed out and the rep is notified. Further, since these quotas change frequently, the system administrator needed an easy way to modify them on an ad hoc basis. Without the rep/item quota matrix housed in a Bot Table, this order check could not be performed.
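The approve-or-zero-out logic above can be sketched in a few lines. The quota matrix lives in an in-memory dict here purely for illustration; in the real solution it lives in a Bot Table that the administrator can edit. All rep and item names are made up.

```python
# Hypothetical quota matrix: (sales rep, item) -> remaining free units.
quota = {("rep-42", "mug"): 10, ("rep-42", "pen"): 0}

def check_line_item(rep, item, qty):
    """Return the approved quantity for a line item.

    Approves and decrements the remaining quota when it covers the
    order; otherwise zeroes out the line (the real bot would also
    notify the rep at this point).
    """
    remaining = quota.get((rep, item), 0)
    if qty <= remaining:
        quota[(rep, item)] = remaining - qty  # approved: decrement quota
        return qty
    return 0  # would exceed quota: zero out the line item

check_line_item("rep-42", "mug", 4)  # within quota: approved, 6 remain
check_line_item("rep-42", "pen", 1)  # quota exhausted: line zeroed out
```

Because the matrix sits in its own table rather than inside the e-commerce system, the administrator can update quotas ad hoc without touching the automation itself.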
The bottom line is that Bot Tables are a necessary component of RPA automation projects and should be built into the platform rather than implemented as a one-off accommodation for each project. Sure, you can implement your own data stores on a project-by-project basis, but having them provided within the platform further helps your RPA automations scale.
KnowledgeLake provides content management solutions that help busy organizations intelligently automate their most important document processes. Since 1999, we've created award-winning, Microsoft-centric solutions that have helped thousands of companies around the world focus on their mission rather than their mission-critical documents.