RESOURCES
We believe that a rising tide lifts all boats, so we work hard to share knowledge that educates and informs. We want to help lead a better future for the industries we serve, making facilities and their operations safer and more efficient.
ProLytX Tech, LLC Completes Separation from Contech Control Services, Inc., Emerges as Independent Entity
Exploring the Reality of AI in Business: A Practical Approach
Artificial Intelligence (AI) has permeated virtually every industry, touted as the catalyst for technological advancement. Yet, amidst the buzz, one fundamental question remains: what does AI look like in day-to-day operations? Let’s delve into the tangible applications of AI, away from the glitzy marketing narratives, to explore its genuine implications for businesses.
At ProLytX, a company deeply entrenched in the complexities of plant engineering, the AI revolution sparks curiosity and skepticism. It seems like everybody suddenly has AI because major technology players like Google and Microsoft provide the technology. But the sudden ubiquity of AI solutions caught many off guard. For us, the challenge isn’t merely about adopting AI but understanding its practical relevance to our work. We’ve been watching the technology for a while, waiting to figure out how it could impact office life, but it has had no significant impact on our day-to-day work until recently.
Our industry often grapples with outdated systems and heaps of legacy documents, hindering the transition to digital platforms. Everyone is looking to build a digital version of their plant, but the question everyone runs into is: how do I get all of my information into it when I have a filing cabinet full of old blueprints?
This dilemma led us to embrace AI not as a solution in itself but as a tool to augment human capabilities. We apply AI in five distinct phases, each aimed at bridging the gap between antiquated systems and modern digital solutions.
Phase One: Harnessing Commercial AI Tools
Our initial look into AI involved leveraging existing tools like Microsoft Azure’s Intelligent Document Processing (IDP). This technology enabled us to analyze documents swiftly, extracting vital information with minimal manual intervention. It’s about capturing as much value as possible out of the box.
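To make that concrete, here is a minimal sketch of what out-of-the-box key-value extraction can look like with Azure's document analysis SDK (the azure-ai-formrecognizer package). The endpoint, key, and file name are placeholders, and this is an illustration rather than our production pipeline.

```python
# Minimal sketch: out-of-the-box key-value extraction with Azure's
# document analysis SDK. Endpoint, key, and file path are placeholders,
# not ProLytX's actual configuration.
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

with open("legacy_datasheet.pdf", "rb") as f:
    # The "prebuilt-document" model extracts text, tables, and key-value
    # pairs without any custom training.
    poller = client.begin_analyze_document("prebuilt-document", document=f)
result = poller.result()

for pair in result.key_value_pairs:
    if pair.key and pair.value:
        print(f"{pair.key.content}: {pair.value.content}")
```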
Phase Two: Intelligent Document Classification
With thousands of documents at hand, the next challenge was organizing them effectively. Here, our AI-driven classification system proved invaluable, categorizing documents based on structure and content. We’re talking about auto-classifying documents to avoid tedious manual sorting.
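As a rough illustration of that idea (not our production classifier), even a simple content-based model can route documents into buckets instead of leaving the sorting to people. The categories and training snippets below are hypothetical.

```python
# Toy sketch of content-based document classification with scikit-learn.
# The categories and training texts are hypothetical placeholders for
# the labeled examples a real pipeline would use.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "instrument index tag number loop service range",
    "piping and instrumentation diagram line number valve",
    "equipment datasheet design pressure temperature material",
]
train_labels = ["instrument_index", "p_and_id", "datasheet"]

classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
classifier.fit(train_texts, train_labels)

# Route a newly OCR'd document to the right bucket instead of sorting by hand.
new_doc_text = "control valve datasheet body material trim size cv"
print(classifier.predict([new_doc_text])[0])  # e.g. "datasheet"
```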
Phase Three: Training AI for Data Extraction
While commercial AI offerings provided a solid foundation, we realized the need for a more tailored approach to data extraction. Our solution involves creating a user-friendly interface that allows subject matter experts to refine AI-generated results. We’re transitioning from relying solely on AI to empowering human expertise.
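A simplified sketch of that hand-off is shown below; the field record and field names are illustrative, not our actual schema. The point is that the AI's answer is only a proposal until an expert confirms or overrides it.

```python
# Sketch of the human-in-the-loop step: AI-proposed values are kept only
# until a subject matter expert confirms or overrides them. The record
# structure and field names are illustrative, not a ProLytX schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractedField:
    name: str
    ai_value: str
    ai_confidence: float
    sme_value: Optional[str] = None  # filled in from the review interface

    @property
    def final_value(self) -> str:
        # Human expertise always wins over the model's guess.
        return self.sme_value if self.sme_value is not None else self.ai_value

field = ExtractedField(name="design_pressure", ai_value="150 psig", ai_confidence=0.62)
field.sme_value = "150 psig @ 100 F"  # expert refines the AI-generated result
print(field.final_value)
```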
Phase Four: Extraction and Quality Assurance
As data extraction proceeds, quality assurance becomes paramount. Our team meticulously reviews AI-generated outputs, identifying errors and anomalies to ensure data accuracy. It's about maintaining a delicate balance between automation and human oversight.
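For illustration, a QA pass might flag records like the one below for human review. The field names and confidence threshold are assumptions made for the sketch.

```python
# Illustrative QA pass over AI-extracted records: flag missing or
# low-confidence values for human review. Field names and the 0.8
# threshold are assumptions for this sketch.
def qa_issues(record: dict, confidence: dict, min_confidence: float = 0.8) -> list:
    issues = []
    for field in ("tag_number", "design_pressure", "material"):
        value = record.get(field)
        if not value:
            issues.append(f"{field}: missing value")
        elif confidence.get(field, 0.0) < min_confidence:
            issues.append(f"{field}: low extraction confidence")
    return issues

record = {"tag_number": "PV-1021", "design_pressure": "", "material": "316 SS"}
confidence = {"tag_number": 0.97, "design_pressure": 0.40, "material": 0.65}
print(qa_issues(record, confidence))
# ['design_pressure: missing value', 'material: low extraction confidence']
```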
Phase Five: Integration into Digital Systems
The final phase entailed seamlessly integrating extracted data into clients’ digital ecosystems. This involved mapping data to relevant fields and migrating it to designated platforms, facilitating a smooth transition to digital twins. It’s about ensuring extracted data finds its rightful place in the digital realm.
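As a simplified example, the mapping step can be as small as a rename layer in front of the target system's loader. The field map and the loader call are hypothetical placeholders, not a specific client platform.

```python
# Sketch of the final mapping step: rename extracted fields to the target
# system's schema and hand them to a loader. The mapping, target field
# names, and load_to_target() call are hypothetical.
FIELD_MAP = {
    "tag_number": "AssetTag",
    "design_pressure": "DesignPressure",
    "material": "BodyMaterial",
}

def to_target_schema(record: dict) -> dict:
    return {target: record.get(source) for source, target in FIELD_MAP.items()}

extracted = {"tag_number": "PV-1021", "design_pressure": "150 psig", "material": "316 SS"}
payload = to_target_schema(extracted)
# load_to_target(payload)  # placeholder for the client's digital twin / asset system API
print(payload)
```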
Through this iterative process, we witnessed AI’s tangible impact on our workflow. From expediting document analysis to enhancing data accuracy, AI has emerged as a formidable ally in our quest for digital transformation.
However, our journey with AI is far from over. As technology evolves, so must our strategies and tools. We’re continually evaluating our workflow and the tools available. We aim to stay at the forefront of AI development, harnessing its potential to drive meaningful change in our industry.
In the ever-evolving landscape of AI, one thing remains certain: its true power lies not in grandiose promises but in its practical applications. At ProLytX, we’re committed to unlocking AI’s full potential, one document at a time.
The Inherent Challenges of Digital Twins and Advanced Analytics
As a baseline, owners of industrial facilities require comprehensive data throughout the lifecycle of their plants. However, the industry-wide shift toward data-centricity, digital twins, and advanced data analytics requires more than a technological evolution; it requires a foundational change. The advanced analytics that owners desire are not possible with deficiencies in foundational data. The conventional model, which relies on Engineering, Procurement, and Construction (EPC) entities to manage data for operational facilities, presents an inherent conflict: data ownership and management sit in the EPC's hands, yet there are often multiple EPCs working on a given project. As a result, the industry grapples with data silos, where valuable information remains underutilized and frequently inaccessible.

While larger EPCs have begun adopting data-centric applications, there is still a gap with smaller entities that have been slower to make this shift. System-agnostic, third-party service providers like ProLytX can help owners pull together engineering data from multiple EPCs, or data sitting in antiquated document management systems, and leverage it transparently and efficiently for the operations and maintenance of their facilities.
The prevailing practice of putting equipment data in an asset management system and engineering drawings/lists in a document management system leaves owners struggling with data availability and trustworthiness. Here are a few things to consider:
- Asset management systems house only a fraction of the overall equipment data. This is by design, as the consumers should reference the engineering data systems to get the full background for the asset. However, these systems are rarely well synchronized.
- Document-centric approaches result in duplicated and fragmented information that is difficult to manage. You can scrape and hotspot documents in an attempt to make them more intelligent, but that has limitations.
- Building a solid data foundation is crucial for unlocking the potential of the digital twin, but it is no easy task. Data-centric engineering applications (3D models, smart drawings driven by a database, cloud-based engineering tools, etc.) are the key to this foundation. These applications require a collaborative effort from administrators, developers, engineers, and designers.
Technology providers are doing their part to address this longstanding gap and steer the industry toward a more progressive, data-centric paradigm. Intelligent design and engineering systems like Hexagon Smart Instrumentation (formerly INTools) and Smart P&ID emphasize housing and utilizing data in the authoring system rather than perpetuating document-centric practices.
Essentially, the conflicts inherent in traditional data management models are being addressed as owners and EPC entities transition to more intelligent design systems. ProLytX aims to empower the industry with comprehensive data and steer it toward a more data-centric, analytically driven future; we are committed to helping companies achieve the digital twins and advanced analytics they desire.
The Origin of Engineering Technology
In the dynamic realm of engineering technology, the landscape has transformed significantly over the years. In a recent podcast featuring Blake and Mike, we got an insider’s look at the evolution of tech in the industry, shedding light on the shifts from traditional paper-based processes to the cloud-centric approaches we see today.
Back in the analog era, engineers wielded pens and rulers, drawing intricate plans on drafting tables. Then came the digital revolution in the ’80s, with the introduction of AutoCAD and the widespread adoption of personal computers. The transition marked a pivotal moment as the industry embraced newfound efficiency.
Venture into the early 2000s, and the intersection of engineering and IT took center stage. In this technological tug-of-war, servers and software licenses became integral components of the landscape. Bridging the gap between these two worlds were the unsung heroes — Engineering Technologists. Blake and Mike emphasized their crucial role in translating the language of IT for engineers and vice versa, facilitating smoother collaboration.
The global project era did not come without challenges. As teams spanned the globe, technologies like Citrix facilitated remote collaboration, but not without hurdles. Enter Active Directory, adding another layer of complexity to the tech ecosystem.
Looking to the future, the landscape presents new challenges. Cloud technology, integration complexities, and the intricate dance of APIs are on the horizon. The key to success will undoubtedly be a strategic focus on AI, customized solutions, and a commitment to staying at the forefront of technological advancements in the ever-changing landscape of engineering technology.
In summary, the journey from traditional paper-based methods to the cloud era is a tale of innovation, the vital role of engineering technologists, and the steadfast support of industry leaders like ProLytX. ProLytX specializes in optimizing engineering applications, managing digital twins, and ensuring the seamless operation of industrial systems. As the industry navigates the future of engineering technology, ProLytX acts as a guide for organizations looking to stay ahead, leading them through the exciting possibilities that lie ahead.
The Truth About the “Easy Button”:
How to Accurately Migrate Massive Amounts of Engineering Data in Minutes
Data is an asset, and operators know that digitizing and moving legacy engineering data into company-managed platforms and maintaining quality data-centric systems in-house is necessary to ensure competitive advantage and mitigate risks. However, digital transformation can be a tedious and arduous process, not for the faint of heart. The problem is that legacy engineering data is typically not structured in neatly labeled columns and rows or rich with metatags as it is in many other domains. Instead, engineering data can be in the form of PDFs, CAD files, Excel workbooks, Word documents, napkins, and other ambiguous formats. In most cases, these unstructured documents or renderings have little or no metadata at all.
To add to the confusion, not only does legacy data come in various file types, but even within the same file type, there could be numerous formats for the same asset. Consider what happens when digitizing data for current assets, such as control valves. A facility could have hundreds of control valves installed within the facility by several different vendors or contractors, each with their unique way of classifying and labeling the specifics of the valve. This leads to the same data point for each valve being labeled differently across the specification documentation. One may call a data point “type,” the other may call the same data point “material,” and yet another might call it “metal.” Before this data can be migrated to digitize the facility, it must be formed, mapped, and rationalized.
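To illustrate the rationalization step, a small synonym map can fold those vendor-specific labels into one canonical field before anything is migrated. The label lists below are just examples drawn from the control valve scenario above, not an exhaustive dictionary.

```python
# Sketch of vendor-vocabulary rationalization: different source labels for
# the same control valve data point are mapped onto one canonical field.
# The synonym lists are illustrative examples only.
CANONICAL_FIELDS = {
    "body_material": {"type", "material", "metal", "body matl"},
    "valve_size": {"size", "nominal size", "valve dia"},
}

def rationalize(raw_record: dict) -> dict:
    clean = {}
    for raw_label, value in raw_record.items():
        label = raw_label.strip().lower()
        for canonical, synonyms in CANONICAL_FIELDS.items():
            if label in synonyms:
                clean[canonical] = value
                break
        else:
            # Anything that doesn't map is parked for engineering review.
            clean.setdefault("unmapped", {})[raw_label] = value
    return clean

print(rationalize({"Metal": "316 SS", "Nominal Size": '2"', "Trim": "equal %"}))
```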
There are traditionally two approaches to data rationalization: manual and automated. Unfortunately, both have flaws in their outcomes and in the way progress is measured: metrics track how quickly data moves from one system to another, with no appreciation for the time it takes to properly tag and map that data.
First is the laborious and costly approach of hiring an engineering company to rationalize and migrate the data manually. Of the two traditional methods, this results in the highest data integrity. However, the approach is slow, and the interruption to business can be drawn out and frustrating, often taking months to complete. Additionally, because this is a lengthy manual effort using highly skilled resources, it is often cost-prohibitive.
The second is an automated approach. Automation can rely on A.I. to rationalize and migrate the data. Unfortunately, automated data migration is often done by well-meaning software vendors and I.T. consulting companies not trained on the uniqueness of engineering data, who naively oversimplify the effort required to digitize legacy engineering, facilities, and asset data. With little understanding of the intricacies of engineering data, they believe the same approach they use in other domains will work. They will say, "It is easy; we will put it in a data lake, and then A.I. and Big Data programs will do the work." But is that true? Technically, yes, the data may be there, but can you find it when you search? And in the case of documents and drawings without metadata, the difference between putting them in a data lake and a black hole is negligible. The lack of engineering-specific knowledge also prevents these vendors from seeing issues upon review. The outcome is low data integrity, resulting in low trust and increased risk. The cost is lower up front, but the clean-up of such an approach is expensive, with overall costs exceeding those of the more laborious manual approach.
There is another approach that marries the two traditional methods: it uses technology to automate the transfer and engineers to rationalize the data and map the outcomes. The result is a migration with high data integrity at a cost far lower than the traditional manual approach, with brief or no business interruption. The secret is in the upfront mapping and rationalization. By assessing each data format and intelligently tagging and mapping each one to the new digital environment, with sample testing along the way, the result is quality data that is trustworthy.
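As a sketch of the sample testing idea, a spot check might pull a random slice of migrated records and compare them field by field against the mapped source. The lookup functions here are placeholders for whatever source and target systems are actually involved.

```python
# Sketch of "sample testing along the way": randomly sample migrated
# records and diff them against the rationalized source data.
# get_source_record()/get_target_record() are placeholder callables.
import random

def spot_check(record_ids, get_source_record, get_target_record, sample_size=50):
    mismatches = []
    for record_id in random.sample(record_ids, min(sample_size, len(record_ids))):
        source = get_source_record(record_id)
        target = get_target_record(record_id)
        for field, expected in source.items():
            if target.get(field) != expected:
                mismatches.append((record_id, field, expected, target.get(field)))
    return mismatches  # empty list means the sample migrated cleanly
```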
The pressure to measure success by how quickly the migration starts, and to judge progress by migration rather than rationalization, is misplaced. Patience is key. The adage "measure twice, cut once" applies. Invest the time in intelligently mapping the data, and the migration should be as simple as hitting an "easy button." We recently moved over 5.6 million cells in less than 18 minutes for a client, but it took three months to map and verify. The client is not only thrilled with the high quality of data now available to its engineers and key stakeholders, but the project also came in under budget and sooner than promised.
Data migration indeed can be as easy as hitting the “easy button.” ProLytX is an Engineering I.T. firm based in Houston, TX, and is a leader in this field, coaching clients to success with a unique combination of engineering and I.T. skills. If you want to learn more about ProLytX and how we can help you bridge the gap between I.T. and Engineering, find us at www.prolytx.com.
Say Goodbye to Proof Testing
One of the biggest and most costly challenges of plant maintenance is proving that systems are functional and safe. Proof testing is one of the most critical aspects of plant operations and costs millions of dollars in lost production every turnaround in the name of safety and compliance.
Even though we have come a long way in how we monitor the operations of facilities, documentation, reporting, and analytics remain problematic. This is because information lies in disparate systems. Instrumentation data from the SIS (safety instrumented system) and DCS (distributed control system) are not easily accessible and are typically low-resolution by the time they hit the DMZ or business LAN; as a result, valuable trip information is left uncaptured or hidden, and the entire proof testing cycle continues. Consequently, companies currently rely on manual testing and teams of technicians to manage this work in the critical path. With digital transformation and an analytics engine to drive analysis and better reporting, plants can take advantage of trip and shutdown information to reduce test crews and streamline offline testing.
A general lack of visibility means that the entire proof test cycle must run its course when it comes time for routine maintenance. Shutdowns that just occurred must be simulated again from the field for validation and sign-off purposes. If there were better analysis and reporting, the testing could be optimized, thereby reducing the critical path and saving hundreds of thousands, if not millions, of dollars every turnaround. ProLytX has partnered with industry majors to tackle this problem.
ProSMART (System Monitoring & Analytics in Real-Time) software can understand how an SIS/DCS is configured and perform high-resolution data capture and real-time analytics and reporting. This disruptive technology is transforming the way plants look at and schedule proof testing and even instrument maintenance. ProSMART removes work scope and time-consuming tests during turnarounds by automating manual reporting tasks and data collection. It also provides a dashboard for leaders to manage risk, approve reports, schedule testing, and identify bad actors.
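As a toy illustration of the kind of analysis this enables (not ProSMART's actual implementation), high-resolution event data captured during a trip can be turned directly into a maintenance metric such as valve stroke time. The tag name, timestamps, and 2-second acceptance limit below are invented for the sketch.

```python
# Toy illustration of turning high-resolution trip data into a maintenance
# metric: valve stroke time from timestamped limit-switch events. The tag
# name, timestamps, and 2.0 s acceptance limit are invented assumptions.
from datetime import datetime

events = [  # (timestamp, signal, value) captured during a unit trip
    (datetime(2024, 3, 1, 14, 2, 10, 120000), "XV-3101.close_cmd", 1),
    (datetime(2024, 3, 1, 14, 2, 11, 870000), "XV-3101.closed_limit", 1),
]

cmd_time = next(t for t, sig, v in events if sig.endswith("close_cmd") and v == 1)
closed_time = next(t for t, sig, v in events if sig.endswith("closed_limit") and v == 1)
stroke_time = (closed_time - cmd_time).total_seconds()

print(f"XV-3101 stroke time: {stroke_time:.2f} s")
if stroke_time <= 2.0:
    print("Within the assumed acceptance limit: trip data may serve as proof-test evidence")
```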
Benefits include:
- Real-time compliance with IEC 61511
- Automated reporting and identification of successes/failures
- Reduction of costly testing during turnarounds
- Improved real-time risk management
- Cross-platform communication spanning a majority of SIS/DCS platforms
Recently, a facility using ProSMART identified $1.5MM in turnaround cost savings due to a reduction in SIS Proof Test requirements and 3rd party technician crews. During the plant shutdown, the operator captured enough data to eliminate over 20% of the planned maintenance testing scope. Automated reports were validated by 3rd party auditors and found to be more than sufficient. Accurate valve stroke time data and instrumentation bad actors were captured, which also allowed for targeted maintenance activities.
ProLytX is an Engineering I.T. firm based in Houston, TX, and is a leader in this field, coaching clients to success with a unique combination of engineering and I.T. skills. If you want to learn more about ProLytX and how we can help you bridge the gap between I.T. and Engineering, find us at www.prolytx.com.
Unrestricted Risk
When an operator considers cybersecurity, they tend to focus on Legal, HR, and Finance data. This makes sense; intellectual property and personal information security are essential for competitive advantage, and Dodd-Frank, SOC, and FTC regulatory compliance also come into play. But what about facilities and engineering data? As the hosting of major Capex projects shifts from contractor environments to owner-managed environments, third parties already have the access they need to collaborate through engineering applications, so what can be gained from securing that access? The better question is: what can be lost by not securing it?
A while back, in a meeting with a client, we were discussing the vulnerabilities of remote access to engineering solutions. The client was sure that their engineering applications were 100% secure. To prove it, he provided a guest login to their hosted engineering applications environment during the meeting. Within minutes, our team had access to the backend data and environment details of one of their most widely used applications. A few moments longer, and we had made it onto their internal business network. They were stunned. This could have been a significant safety risk had these applications been hosted in part on their PSN.
So why is no one looking at this type of access as a security breach? For one, the objectives are mostly honest; the perpetrator's goal is typically not to sabotage or steal, but to work more efficiently on their projects. After all, the data is project information and is made available in the engineering application anyway. However, it is not the access to the data that is the concern; it is how the data is accessed. Problems occur when users bypass the engineering applications to gain entry into the backend databases. That database access can allow contractors to streamline their workflows or practices. This is good, maybe? But don't let these benign intentions distract from the inherent risks.
There are several potential problems with failing to address security in these engineering application-hosted environments.
- Direct and unauthorized access to the database undermines the safeguards built into application workflows, bypassing approval processes and activity tracking.
- There is no way to guarantee that everyone is operating with the best intentions.
- There is no way for a user with knowledge of only one project to fully understand how data is being used throughout the facility on other projects.
- Changes to the underlying data can be disastrous and expensive even when the intentions are virtuous.
With today’s remote work environments and large data projects, it is common for operators to work with several contractors across multiple projects at any given time: all of them accessing the same information. The impact of changing a single data point could have devastating and costly repercussions to others, like a catastrophic failure or tens of thousands of man-hours to fix an issue. Breaches don’t have to be terroristic or competitively motivated to be threatening. They generally come from engineers merely trying to simplify their work by venturing into a restricted area without a sign or lock on the door.
By preemptively implementing an engineering application-based security solution in hosted environments, companies can prevent unwanted access, maintain better data integrity, and minimize the risk of mistakes whose consequences are far more costly than the security solution itself.
ProLytX is an Engineering IT firm based in Houston, TX, and is a leader in this field, coaching clients to success with a unique combination of engineering and IT skills. If you want to learn more about ProLytX and how we can help you bridge the gap between IT and Engineering, find us at www.prolytx.com.