AI-Driven Business Process Automation in JDE

 

Written by Brett Falck

Over the past year and a half, ChatGPT has taken the world by storm. Within a mere two months of its release, it amassed a staggering 100 million users. As consumers worldwide embrace this groundbreaking technology, many have found ways to enhance aspects of their daily lives with it. However, ChatGPT’s utility extends far beyond consumer applications. Across industries, it has the potential to act as a formidable force multiplier, amplifying employee output while eliminating menial tasks. In the realm of ERPs in particular, ChatGPT, along with its AI contemporaries, promises to revolutionize and streamline the user experience.

Before diving too deep, an explanation of how ChatGPT works is in order. ChatGPT is a type of AI known as a large language model (LLM); LLMs, in turn, are a subset of deep learning models. In essence, an LLM is a model trained on vast swaths of textual data to both understand existing text and generate new text. This understanding, combined with its generation capabilities, is what gives ChatGPT its power.

Leveraging such textual power within Enterprise Resource Planning paints an exciting future. Consider the standard use case of a manufacturing setup: procurement managers often need to navigate complex interfaces within an ERP, while new employees are daunted by those same intricacies. Integrating a textual interface into the ERP can mitigate these challenges. New users can interact with the textual interface to learn the ERP, asking questions such as “How do I create a purchase order?” Through natural dialogue, the LLM can gradually onboard the user. With a few further tweaks and API integrations, this dialogue can be extended to trigger and perform ERP automations on the user’s behalf, eliminating the need to navigate those complex menus. This transforms the ERP experience from a wrestling match with software into a streamlined conversation, enhancing efficiency across the board.
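As a rough sketch of the routing logic such a conversational interface implies, the example below asks an LLM to classify an incoming message as either a documentation question or an action request, then branches accordingly. The model name, the classification prompt, and the create_purchase_order helper are illustrative assumptions, not part of any specific product or of the demo described later.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def route_message(user_message: str) -> str:
    """Classify a user message as a question or an action request, then respond."""
    # Ask the LLM to label the intent; the label set here is purely illustrative.
    label = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Label the user's message as QUESTION or ACTION. Reply with one word."},
            {"role": "user", "content": user_message},
        ],
    ).choices[0].message.content.strip().upper()

    if label == "ACTION":
        # Hand off to an ERP automation (hypothetical helper, not a real JDE API).
        return create_purchase_order(user_message)

    # Otherwise answer conversationally, e.g. walk the user through the screens.
    answer = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a helpful JD Edwards onboarding assistant."},
            {"role": "user", "content": user_message},
        ],
    )
    return answer.choices[0].message.content


def create_purchase_order(request_text: str) -> str:
    # Placeholder: in a real build this would collect order details and call the ERP.
    return "Okay, let's gather the details for your purchase order."
```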

J. Geiger Consulting recently had the opportunity to build a demo around this use case. We chose to build a chatbot within Microsoft’s Power Virtual Agents because of its native integration with both Power Automate and LLMs like ChatGPT. To enable the educational aspect described above, we used the Boost Conversations feature of Power Virtual Agents in conjunction with a custom web search pointed at several JD Edwards EnterpriseOne documentation sources. Whenever the bot received a question about JDE, it would perform a web search and pass the results into an LLM through the Boost Conversations feature. The LLM would then craft an answer to the user’s query while providing the sources for its answer.
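Boost Conversations handles this search-and-summarize step inside Power Virtual Agents, so no code was required for that part of the demo; the sketch below is only a conceptual approximation of the flow in plain Python. The search_jde_docs helper stands in for the custom web search over the documentation sources, and the model name, prompt wording, and sample result are all assumptions.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def search_jde_docs(question: str) -> list[dict]:
    """Placeholder for the custom web search over JD Edwards EnterpriseOne docs.

    A real implementation would call a search API scoped to the documentation
    sites used in the demo and return the top hits.
    """
    return [
        {"title": "Entering Purchase Orders",
         "url": "https://docs.oracle.com/...",
         "snippet": "Use the Purchase Orders program to enter order headers and detail lines."},
    ]


def answer_with_sources(question: str) -> str:
    """Search the docs, then let the LLM draft an answer that cites its sources."""
    results = search_jde_docs(question)
    context = "\n\n".join(f"[{r['title']}]({r['url']})\n{r['snippet']}" for r in results)

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer the user's JD Edwards question using only the excerpts "
                        "provided, and list the source links you relied on."},
            {"role": "user", "content": f"Excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content


print(answer_with_sources("How do I create a purchase order?"))
```

Grounding the answer in retrieved excerpts, rather than in the model’s memory alone, is what lets the bot cite its sources for each response.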

Recognizing that even more functionality could be offered to the user, we integrated the chatbot with a JDE instance so that automations could be triggered from the user’s prompts. This was done by prompting the user for data, then passing that data from Power Virtual Agents through Power Automate to JDE via API calls. With this in place, a user could ask a question such as “How do I create a purchase order?”, the chatbot would dynamically generate an answer, and it would then ask whether they would like help creating a purchase order. If the user responded yes, the chatbot would prompt for the necessary order details and trigger the JDE automation. On the other hand, if the user wanted to skip the documentation, they could directly ask to create a purchase order, and the bot would detect the imperative statement and jump straight to order creation.
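In the demo, Power Automate made the API calls into JDE, so again no hand-written code was involved; the sketch below shows roughly what an equivalent REST call could look like if made directly. The AIS server URL, the orchestration name, the input field names, and the credentials are all hypothetical, and the exact endpoint and payload depend on how orchestrations are exposed in your JD Edwards environment.

```python
import requests

# Hypothetical values: substitute your own AIS server, orchestration, and credentials.
AIS_BASE_URL = "https://ais.example.com:9080/jderest/v3/orchestrator"
ORCHESTRATION = "JGC_CreatePurchaseOrder"


def create_purchase_order(supplier: str, branch_plant: str, item: str, quantity: int) -> dict:
    """Trigger a JDE orchestration with the order details collected by the chatbot."""
    payload = {
        # Input names must match those defined on the orchestration itself.
        "Supplier": supplier,
        "BranchPlant": branch_plant,
        "ItemNumber": item,
        "Quantity": quantity,
    }
    response = requests.post(
        f"{AIS_BASE_URL}/{ORCHESTRATION}",
        json=payload,
        auth=("JDE_SERVICE_USER", "JDE_SERVICE_PASSWORD"),  # basic auth for illustration only
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # e.g. the order number returned by the orchestration


if __name__ == "__main__":
    print(create_purchase_order("4242", "M30", "220", 10))
```

A Power Automate HTTP action would carry the same kind of payload; the chatbot only supplies the values it collected from the user.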

The integration between the chatbot and JDE was well received in the user community. At the PackerLand User Group (PLUG) summer meeting, the functionality was demonstrated to all in attendance and sparked a lot of good discussion around use cases at the companies represented. There was interest in using the chatbot for internal documentation, particularly around onboarding and benefits. There was also interest in using the JDE integration to let those doing off-hours support launch functionality from a phone or laptop without being logged into the JD Edwards instance directly. Overall, the response at PLUG was extremely positive, and the functionality was also demonstrated at the INFOCUS conference.

The capabilities seen so far with LLMs such as ChatGPT are only the tip of the iceberg. We are at the start of a new era of technology that will revolutionize both the consumer and business experience. Already, LLM producers are incorporating new features every quarter. Information security, image recognition, image generation, code generation, and data analytics are just a few of the many things on the horizon or already released. If LLMs like ChatGPT excite you and you want to partner with us on this journey, please reach out to set up a complimentary conversation about how J. Geiger Consulting can help you achieve your ideal outcome with JD Edwards and AI.
