Posts Tagged "PLM"

For many companies, PLM starts as a CAD/design data vault and later evolves into a design data exchange platform. The most successful companies take PLM beyond a design data exchange and access control platform to a knowledge-driven decision support system. This means PLM not only needs to manage the multitude of information generated at various stages of the product lifecycle, but also to capture product development knowledge and feed it back into the lifecycle. For example, the requirements and design for a newer version of a product need to be driven by the knowledge elements captured from the previous version’s lifecycle, from inception through design to manufacturing and service.

When PLM stays within the design engineering world, it is constrained in exchanging information with, and capturing knowledge from, downstream stages managed by disconnected, siloed systems. This results in engineers spending enormous amounts of time on data acquisition tasks. Industry studies show that information workers spend 30-40% of their time on information gathering and analysis alone: searching for nonexistent information, failing to find existing information, validating information, or recreating information that can’t be found.

Quality escapes, where the product doesn’t conform to the engineering definition, are another challenge with such disconnected systems. Non-conformances found on the shop floor are costly to review and disposition, and even more severe when the product is already in service. Reconciling change is also extremely challenging, especially its downstream propagation, resulting in significant productivity losses. Slow change processing, along with quality escapes, delays new product introduction and affects a company’s overall ability to compete.

The first step toward transforming PLM into a true knowledge-driven decision support system is to extend it to the CAD/CAM/CNC process chain, taking it to the shop floor. Such a solution establishes a continuous loop from engineering into shop floor operations management and manufacturing execution systems (MES). A continuous-loop system provides more ways to capture business intelligence and then suggest solutions based on previous patterns. It then becomes much easier to capture information and use analytics to synthesize valuable knowledge elements, compared to the fragmented solutions many companies have today. It is also a foundational element for establishing a digital twin per the Industry 4.0 vision.


Other key benefits of extending PLM to manufacturing include:

Reducing the time to market

  • Enhanced collaboration between Product and Manufacturing Engineering
  • Enhanced Traceability and Faster Change Management

Enhancing Flexibility

  • Manufacturing plans that comprehend product variability and complexity
  • “What if” scenarios for optimized decision making

Increasing Quality

  • Manufacturing Simulation and validation integrated in PLM
  • Up-to-date 3D work instructions delivered to the shop floor

Increasing Efficiency

  • Ongoing process optimization based on closed-loop feedback of utilization data
  • Reuse of common methods/tooling

Robots, automation, and what some would even call “artificial intelligence” are everywhere around us today. From factories to automobiles, from our smartphones to our refrigerators, digitalization has had a profound impact over recent decades. This doesn’t appear to be slowing down anytime soon.

Automation and robots in factories have replaced many assembly line jobs, and this has also led to great improvements in quality and efficiency. Instead of assembly line workers, we now have robot technicians and programmers.

The big question is “Why?” And the answer is generally along the lines of improved efficiency, driven by the need to stay competitive.

It could be argued that we are seeing the same thing in the engineering environment with design automation and the associated engineering intelligence built into repeatable design types… or automated workflows that are electronically managed to ensure process repeatability, quality and efficiency.

This raises the question: can companies survive in today’s market without investing in their business and engineering processes the way they have (or must) in manufacturing?

This is a further follow-up to my previous articles on digital twins, focusing on the feedback loop pillar.

Smart Factory loop
The feedback loop starts with the Smart Factory. This is a fully digitalized factory model of a production system, connected via sensors, SCADA systems, PLCs, or other automation devices to the main product lifecycle management (PLM) data repository. In the Smart Factory, all events on the physical shop floor during production are recorded and pushed back to the PLM system, either directly or through the cloud. Artificial intelligence (AI) technology is used to study and analyze this information, and the main findings are sent back to product development, manufacturing planning, or facility planning.
Why is this important? Production facilities and manufacturing processes tend to change immediately after the start of production. New ideas will be implemented, new working methods will be deployed, and new suppliers might be selected, all requiring changes to the production system or process. Since these modifications will certainly impact the future, updating them in the system at this stage is a must. Production systems outlive the product lifecycle, and many companies use their production systems to make multiple products. These factors contribute to the increasing need to regularly capture these changes in the PLM system, which can later be used to distribute this information to all parties. The information collected during production can also serve as the basis for improving the maintainability of manufacturing resources. With this information, we can enable much better sensor-based, condition-based maintenance, and thus increase uptime and productivity.
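To make this loop concrete, here is a minimal Python sketch of the push side. Everything in it is an assumption for illustration: the endpoint URL, the payload fields, and the read_sensor_event stand-in. A real deployment would go through the PLM vendor’s integration APIs or an IoT platform rather than a hand-rolled HTTP call.

```python
import json
import time
from urllib import request

# Hypothetical endpoint on the PLM data repository (illustration only).
PLM_ENDPOINT = "https://plm.example.com/api/shopfloor-events"

def read_sensor_event() -> dict:
    """Stand-in for a SCADA/PLC read; returns one production event."""
    return {
        "station": "WELD-03",          # illustrative station id
        "timestamp": time.time(),
        "metric": "cycle_time_s",
        "value": 42.7,
    }

def push_event(event: dict) -> int:
    """POST one event to the PLM repository; return the HTTP status."""
    req = request.Request(
        PLM_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    print("HTTP", push_event(read_sensor_event()))
```

The essential design point is the direction of flow: shop-floor events stream into the PLM repository continuously, rather than being reconciled manually after the fact.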

Smart product loop
Almost every product made today is a smart product. Many companies are looking for ways to improve the connection with their smart products while they are being used by their customers. Monitoring product use can provide a lot of knowledge for improving products. More than that, connecting to these smart products can generate a new type of business model that may result in more competitive offerings.

PLM Challenge

These feedback loops and the data they generate are a challenge for PLM, too. In the short term, the PLM issue for digital twins is how IoT-gathered data can best be put to work: extrapolated, parsed, and redirected. To where? At whose direction? The quick and easy solutions are analytics running in the cloud, machine-to-machine (M2M) communication, and analyses based on artificial intelligence (AI). Such questions are expected as digital twins emerge as the next revolution in both data management and lifecycle management.
Ultimately, the use of PLM will allow us to bring digital twins into close correspondence, in sync, with their physical equivalents in the real world. When this comes to pass, we can expect problems to be uncovered more quickly and products to be better supported. Products with digital twins will be more reliable, with less downtime, while operating more efficiently and at lower cost. PLM-powered digital twins will boost user and owner confidence in their physical products. In the end, digital twins reflect what users and owners expect to receive when they sign a contract or purchase order.

Standing on the beach, overlooking the bountiful yet imperfect harvest, he pondered the situation in front of him. “Why are all of my troop mates eating these sand-covered sweet potatoes? In the beginning, they were delicious… and without the sand. Now these wonderful treats are all but inedible. What if I…

This is the beginning of a tale based on a scientific research project, though it may have evolved into something of an urban legend. The idea is that scientists in Japan, circa 1952, were studying the behavior of an island full of macaque monkeys. At first, the scientists gave the monkeys sweet potatoes. After a period of time, the scientists started covering the sweet potatoes in sand to observe how the monkeys would react. Not surprisingly, the monkeys still ate the treats, however begrudgingly. Then, the story goes, a young monkey took his vegetable to the water and washed it off. He discovered that it tasted as good as it had before the sand. Excitedly, the young monkey showed this discovery to his mother. Approvingly, his mother began washing hers in the water as well.

Still, the vast majority went on, crunching away on their gritty meals. Over time, a few more monkeys caught on. It wasn’t until a magic number of monkeys were doing this – we’ll say the 100th – that seemingly the entire troop began rinsing their sweet potatoes in the water.

Call it what you will – social validation, the tipping point, the 100th monkey effect – it all comes down to the idea that we may not try something new, however potentially beneficial, until it’s “OK” to do so. Cloud solutions for PLM could be coming to that point. These products have been in the market for a few years now, and they mature with every update (and with no upgrade headaches, either).

As IDG Enterprise reports in its “2016 IDG Enterprise Cloud Computing Survey,” “Within the next three years, organizations have the largest plans to move data storage/data management (43%) and business/data analytics (43%) to the cloud.” Another survey, RightScale’s “2017 State of the Cloud Survey,” finds that overall challenges to adopting cloud services have declined. One of the most important concerns, security, has fallen from 29% of respondents reporting it as a challenge to 25%. Security is still a valid concern, though I think the market is starting to trust the cloud more and more.

With our experience and expertise with PLM solutions in the cloud, Tata Technologies can help you choose if, when, and how a cloud solution could be right for your company. Let us know how we can help.

My last post outlined how to derive more value from PLM data through reports. The complexity of data in the engineering environment is skyrocketing, and Teamcenter as a PLM system provides advanced reporting capabilities for enterprise data, including data managed in external systems such as MRP and ERP.

The Teamcenter Report Builder application provides basic reporting capabilities for data managed inside Teamcenter. It supports two kinds of reports:

  1. Summary Reports
  • Reports that summarize similar information – for example, reports that show all the employees, all the items belonging to a user, or the release status of items
  • Context-independent reports – no object selection required
  • Generated from Teamcenter saved queries
  2. Item Reports
  • Reports that can be run on a particular object – for example, BOM or workflow information for a given object
  • Executed in the context of one or more objects

Behind the scenes, Report Builder uses a data dump based on Teamcenter queries and supports output to common formats like Excel, XML, text, and HTML. It’s easy to build these simple reports from Teamcenter queries, and they can be run from both the rich client and the Active Workspace client. Excel can be further leveraged for complex processing, charting, and aggregation of the output.
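As a small illustration of that last point, here is a Python sketch that post-processes a Report Builder XML export into summary counts. The element and attribute names (row, release_status) are assumptions for illustration; the actual export structure depends on the saved query behind the report.

```python
import xml.etree.ElementTree as ET
from collections import Counter

def summarize_release_status(xml_path: str) -> Counter:
    """Count items per release status in an exported summary report.

    Assumes <row release_status="..."/> elements, which is a made-up
    schema for illustration; the real layout follows the saved query.
    """
    counts = Counter()
    for row in ET.parse(xml_path).getroot().iter("row"):
        counts[row.get("release_status", "Unreleased")] += 1
    return counts

if __name__ == "__main__":
    for status, count in summarize_release_status("item_report.xml").most_common():
        print(f"{status}: {count}")
```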

The Teamcenter Reporting & Analytics module provides additional advanced reporting capabilities. It can summarize information and present data from many sources in a single report using an easy-to-build, configurable, drag-and-drop layout.

It can leverage standard formatting tools like headers/footers, dates, page numbers, report names, filters, tables, charts, and elements. Reports can be run from either the Active Workspace or the Teamcenter Reporting & Analytics client. Its business intelligence is designed for Teamcenter, understanding the relationships and associations of PLM information. It comes with over 100 out-of-the-box reports in areas like change management, BOM reports, substance compliance, workflow, administrator reports, verification management, PMM, Schedule Manager, and Requirements Manager. It supports powerful and fast BOM reporting, project planning, status reporting and dashboards, and process and change reporting.

It has direct access to Teamcenter data through APIs and has connectors to standard enterprise applications. It can also enforce data security based on the Teamcenter access model.  Additional capabilities include:

  • Customized Analytics
    • Organization-specific process status metrics and KPIs
    • Multi-level root-cause analysis
    • Mean time between failures / mean time to failure (MTBF, MTTF) analysis (see the sketch after this list)
    • Historical Performance Analysis
  • Reporting Control
    • Save Snapshots of pre-defined reports
    • Group/Role based Access to report data
    • User Controlled Conditional Formatting
  • Resource Management
    • Automated Report Scheduler and Delivery
    • Submit Analysis to queue for load management
    • Caching Techniques to reuse data cubes
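To make the MTBF/MTTF item above concrete, here is a minimal sketch of the underlying arithmetic, assuming a simple failure log for one resource. The data layout is hypothetical, and repair time is ignored for simplicity.

```python
from datetime import datetime

# Illustrative failure log for one manufacturing resource.
failures = [
    datetime(2017, 1, 10, 8, 30),
    datetime(2017, 2, 2, 14, 0),
    datetime(2017, 3, 15, 9, 45),
]
observation_start = datetime(2017, 1, 1)
observation_end = datetime(2017, 4, 1)

# MTBF = total operating time / number of failures (repair time ignored
# here for simplicity; a real analysis would subtract downtime).
operating_hours = (observation_end - observation_start).total_seconds() / 3600
mtbf_hours = operating_hours / len(failures)
print(f"MTBF ~ {mtbf_hours:.1f} hours over {len(failures)} failures")
```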

Teamcenter Reporting & Analytics benefits include:

  • Analytics, Dashboards and Traditional Reporting – understand your data to improve your products and processes
  • Time to Value – start with pre-configured reports and enable custom reports for your business in a couple of weeks, not months or years
  • Designed for Teamcenter – enable your entire enterprise to easily understand the information they require to make better decisions
  • Self-Service Analytics – enable data discovery through self-service analytics designed for Teamcenter and optimized to your needs

There they were, sailing along on their merry way. On the horizon, a narrow strait approached. As the boat got closer, they noticed a couple of strange characteristics: on one side a cliff, and on the other a whirlpool. Upon arrival, it became apparent that this was the cliff where the monster Scylla dwelled. On the other side, the monster Charybdis spewed out huge amounts of water, causing deadly whirlpools. Each monster was close enough that to avoid one meant meeting the other. Determined to get through, our intrepid hero Ulysses had to make a decision. The idiom “between Scylla and Charybdis” comes from this story. In more modern terms, we would translate this to “the lesser of two evils.”

PLM administrators, engineering managers, and IT teams are often given this same choice, with equally deadly – well, unfortunate – outcomes. What is the dilemma? Customize the PLM system (beyond mere configuration) to match company policies and processes, or change the culture to bend to the limitations posed by “out of the box” configurations.

Companies will often say something to the effect of “We need the system to do X,” to which many vendors meekly reply, “Well, it can’t exactly do X, but it’s close.” So what is a decision maker to do? Trust that the organization can adapt, risking lost productivity and possibly mutiny? Or respond by asking “What will it take to get it to do X?”, incurring the risk of additional cost and implementation time?

We can further elaborate on the risks of each. When initially developing customizations, there is the risk of what I call “vision mismatch”: X is described as well as possible, but the full understanding of the bigger picture is missed when the developer writes up the specification. This leads to multiple revisions of the code and frustrations on both sides of the table. Customizations then carry the longer-term risk of “locking” you into a specific version. While you gain the benefit of keeping your processes perfectly intact, the system is stuck in time unless the customizations are upgraded in parallel. Some companies avoid that by never upgrading… until their hardware, operating systems, or underlying software becomes unsupported and obsolete. Then the whole thing can come to a crashing halt. Hope the backups work!

However, not customizing has its own risks. What if the new PLM system is replacing an older “homegrown” system that automated some processes the new system cannot? (And a homegrown system comes with its own set of risks: the original coder leaves the company, uncommented code, no specifications, etc.) For example, suppose raising an issue automatically created an engineering change request while starting a CAPA process; without that automation, the company has gained a manual process, exposing it to human error. Or perhaps the company has a policy that requires change orders to go through a “four-eyes” approval process, and the new system has no mechanism to support such a use case.

Customizing is akin to Charybdis, whom Ulysses avoided, deciding that it was better to knowingly lose a few crew members than to risk losing the entire ship to the whirlpool. Not customizing is more like Scylla: each individual loss is smaller, but the probability of loss is much higher, to the point of near certainty.

We’ve been through these straits and lived. We’ve made the passage with many companies, from large multinationals to the proverbial “ma and pa” shops. Let us help you navigate the dangers with our PLM Analytics benchmark.

Enterprise-wide PLM systems hold huge amounts of business data that can potentially be used to drive business decisions and effect process changes that generate added value for the organization. Many PLM users are unaware that such valuable data exists, while for others, advanced data search and retrieval can feel like looking for a needle in a haystack because they are unfamiliar with the PLM data model. Hence it is important to process the data into meaningful information and model it into actionable engineering knowledge that can drive business decisions for ordinary users. Reporting plays a key role in summarizing that large amount of data into a simple, usable format that is easy to understand.

Reporting starts with capturing the right data – the most important step and, often, the least emphasized one. When data is not captured in the right format, the result is inconsistent or non-standard data.

Let’s take a simple workflow example: workflow rejection comments are valuable information for companies seeking to understand the recurring reasons for rejection and to improve first-time yield (FTY) by developing training plans that address them. Users might not enter rejection comments unless they are mandatory, so it’s important to have data-model checks and balances that capture the right data and standardize it through categorization and lists of values (LOVs).
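A minimal Python sketch of such a check might look like the following. The category names and the validation function are hypothetical, not part of any particular PLM data model; in a real system, the LOV and mandatory-field rules would live in the PLM configuration itself.

```python
# Illustrative LOV for rejection categories; real categories would be
# defined in the PLM data model, not hard-coded like this.
REJECTION_CATEGORIES = {
    "MISSING_DIMENSION",
    "WRONG_MATERIAL",
    "INCOMPLETE_DOCUMENTATION",
    "OTHER",
}

def validate_rejection(category: str, comment: str) -> None:
    """Enforce a mandatory, categorized rejection comment."""
    if category not in REJECTION_CATEGORIES:
        raise ValueError(f"category must be one of {sorted(REJECTION_CATEGORIES)}")
    if not comment.strip():
        raise ValueError("a rejection comment is mandatory")

# Standardized data like this makes FTY reporting a simple aggregation.
validate_rejection("WRONG_MATERIAL", "Spec calls for 6061-T6; model uses 7075")
```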

The next step is to filter and present the right information to the right people. End users typically want to run pre-designed reports and perhaps slice and dice the data to understand it better. Business intelligence designers and business analysts who understand the PLM schema and its business relationships are the ones who design the report templates. Report design is sometimes perceived as an IT or software function, and as a result, sufficient business participation is not ensured, which can reduce the effectiveness of the reports for end users. It is important to have business participation from report identification through report design to report usage. Business process knowledge is the key in this area, not PLM tool expertise alone.

Since business processes get improved and modified based on market and performance trends derived from PLM reports, it’s important to have continuous improvement initiatives that fine-tune reporting based on these improved processes and new baselines, from data capture to presentation. That makes it a continuous cycle – business processes need to be designed to support reporting, and reports need to help improve the processes.

Properly designed reports provide increased visibility into shifting enterprise-wide status, reduce the time and cost of data analysis, ensure quicker response times and faster product launch cycles, and improve product quality and integrity.

How do your reports measure up? Do you have any questions or thoughts? Leave a comment here or contact us if you’re feeling shy.

Sometimes CAD can be used to start establishing PLM practices. Since PLM systems rely on data to be effective, ensuring consistent and correctly entered information is paramount. Things like classification with properties and metadata can rely heavily on CAD to be used effectively. For example, let’s consider the classification and data for a machined part. If the part requires machining, we could assign it a classification of “Machined.” Since the part is going to be machined, we would want to ensure that “Stock Size” is one piece of metadata to be tracked. Most CAD systems have a way to ensure this “Stock Size” is at least filled out, and some can even be automated to calculate the stock size without any user intervention. Of course, repeatable logic would need to be utilized, but once that is done, the time spent on stock size calculations and the potential for errors would be eliminated.


Case in point: Utilize iLogic in Autodesk Inventor to calculate stock size for machined parts. Once this is done, users can forget about manually checking all the measurements; all they need to do is flag the part as “Machined” and the system does the rest!
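A sketch of the underlying stock-size logic is below, written in Python for illustration; an actual iLogic rule would be written in VB syntax against Inventor’s API. The 3 mm machining allowance per side and the 5 mm rounding increment are assumptions, not values from any particular shop standard.

```python
import math

# Assumed machining allowance per side and stock rounding increment.
ALLOWANCE_MM = 3.0
ROUND_TO_MM = 5.0

def stock_size(bbox_mm: tuple) -> tuple:
    """Add allowance to each bounding-box dimension and round up
    to the next stock increment."""
    return tuple(
        math.ceil((dim + 2 * ALLOWANCE_MM) / ROUND_TO_MM) * ROUND_TO_MM
        for dim in bbox_mm
    )

# Example: a part with a 97 x 44 x 21 mm bounding box
print(stock_size((97.0, 44.0, 21.0)))  # -> (105.0, 50.0, 30.0)
```

Because the rule is deterministic, every “Machined” part gets a consistent, review-free stock size, which is exactly the kind of clean metadata a downstream PLM classification depends on.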


You have probably heard about Teamcenter. You know it is used extensively at the major OEMs. But what does it actually do? How is it made? This article will explain, from the very beginning, what Teamcenter is and how it is put together.

Teamcenter is a Product Lifecycle Management (PLM) platform and is primarily designed to support the design and development of products that subsequently get manufactured. Siemens PLM Software gives a single-picture representation of Teamcenter as follows:

(Teamcenter platform overview diagram)

Firstly, Teamcenter is a platform, meaning that it houses multiple capabilities (apps) in a single seamless piece of software. A user logs into one interface and is able to access all the installed and licensed capabilities of their Teamcenter from there. The platform is based on a database, allowing for storage and indexing of data.

Secondly, the platform is scalable to suit the requirements of each organization using the technology. In fact, Siemens PLM Software has categorized the apps into three groups: Start, for initial deployment; Extend, to increase functionality; and Transform, to move to advanced capabilities.

Thirdly, Teamcenter grew out of the necessity to store and maintain all the digital data generated during product development after widespread adoption of CAD applications. It now has extensive capabilities beyond that.

Let us look at the three groups and define each of the apps at a high level: […]

So you’re an executive at a manufacturing company. You make things that are useful to your customers and you return profits to ever-demanding shareholders. You have probably heard of PLM before; perhaps your staff have mentioned the acronym. But how badly do you need it?

Here are 10 indicators that you definitely need PLM:

  1. Your engineering organization is often late meeting customer deadlines. This results from poorly executed projects, inefficient processes and lack of clear deliverables. All of these problems can be addressed by a PLM system supporting the engineering organization.
  2. Warranty costs are creeping up. One of the largest contributors to poor product quality is sloppy design and incomplete engineering definition. Installing appropriate PLM technology to support design activities results in a better specification being communicated to manufacturing.
  3. Factory scrap rates are above industry standards. For example, scrap and rework is often traced back to a wrong drawing, an incorrect dimension or a poorly specified component. Complete and accurate product design is supported by a robust PLM system.
  4. R&D costs as a percentage of revenue are excessive. Engineering and design activity is bloated with too much headcount and overhead, yet deliverables are still late. PLM means efficiency in R&D.
  5. The organization struggles with coordination. It appears as if manufacturing and engineering are always at odds with both departments blaming one another for mistakes. PLM can offer objective data to resolve these issues.
  6. There is no accountability in the organization. It is difficult to diagnose where mistakes were made and who is responsible. People are always blaming other departments. A PLM system can provide objective data that allows the root cause to be addressed.
  7. Expedited freight costs are bleeding away your profits. Excessive expedited freight costs are common in companies that are late with deliveries and have to ship under duress to avoid customer penalties. Better upstream engineering supported by PLM can improve this problem.
  8. Your competitors always beat you to market with new products. Is innovation management and new product introduction a problem for your organization? A better PLM system can make dramatic differences in this area.
  9. Customers complain that they do not get the information they need. You owe your customers information at various stages during the engagement cycle and they never get it in a timely manner. A suitably configured PLM system can improve this dramatically.
  10. Your suppliers provide the wrong information. This is a common problem diagnosed by engineering staff. But do your suppliers receive the right request to begin with? PLM technology can bridge this gap.

Do you have three or more of these issues keeping you up at night? Time to take a serious look at a PLM system.
