Industry Insights

Read articles by industry-leading experts at Tata Technologies as they present information about PLM products, training, knowledge expertise, and more. Sign up below to receive updates on posts by email.

The Product Excellence Program helps Siemens PLM Software understand how customers use their products and assists them in improving the software in future releases. The program is designed to protect the privacy of the user and the intellectual property created through the use of Siemens PLM Software products. It is used to collect data about Siemens PLM Active Workspace product usage and the associated Teamcenter platform installation. Data collection occurs in the background as the software is used and does not affect performance or functionality; the collected data is sent to Siemens PLM Software for analysis. Per Siemens PLM, no contact information is contained in the data collected, nor is any information about the data created or managed. The data is solely for use by Siemens PLM Software and is never shared with third parties.

Participation in the Product Excellence Program is enabled by default during installation using either TEM or Deployment Center. System administrators can always opt out during installation. Post-install, participation can be controlled with the TC_ProductExcellenceProgram site preference. All data collection is anonymous and covers product usage, the Teamcenter server platform (version, platform, architecture), the client environment (browser type and version), and client page visits; the collected data is sent from the client browser.
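As an illustration only, an exported site preference controlling opt-out might look something like the fragment below. The preference name comes from the text above, but the XML schema details are an assumption and vary by Teamcenter version; consult your Teamcenter documentation for the exact format.

```
<!-- Hypothetical preference export; schema details vary by Teamcenter version -->
<preference name="TC_ProductExcellenceProgram" type="String" array="false" protectionScope="Site">
  <preference_description>Controls participation in the Product Excellence Program</preference_description>
  <context name="Teamcenter">
    <value>false</value> <!-- "false" opts the site out of data collection -->
  </context>
</preference>
```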

My last blog focused on the need for a Manufacturing BOM (mBOM). When organizations start to embrace the value of the mBOM and decide to invest in solutions to manage it, the first question is where to master it: PLM or ERP?

The answer to that question varies depending on the maturity level of PLM and ERP adoption and penetration in the organization. If both PLM and ERP are at the same or a similar maturity level, then there are many good reasons to author and manage the mBOM in a PLM system and to make ERP a consumer of the mBOM mastered in PLM.

First, in PLM the mBOM is integrated with the eBOM and the design process. eBOM integration and reuse enables front loading, helps the manufacturing team lower the cost of mBOM authoring and management, and shortens time to market. Manufacturing users can also leverage the 3D visualization data in the mBOM for better decisions and better quality. With the master model approach being adopted by leading organizations, there is a lot of Product Manufacturing Information (PMI) on the 3D master model which can be leveraged in both the mBOM and downstream process planning. The mBOM can also act as the starting point for detailed process planning to create the Bill of Process (BOP) inside PLM. The BOP, or routing, can also leverage the 3D visualization data to produce visual work instructions, which always remain up to date with upstream design changes. The process plans can also be simulated and validated (feasibility, human ergonomics, collision, etc.) before actual execution. The validated routing then gets sent to Manufacturing Execution Systems (MES) along with the visual work instructions. That way there is full traceability from CAD to eBOM to mBOM to BOP and eventually to MES.

This traceability enables users to run where-used queries across all products and plants during a change process, ensuring that all product changes are evaluated for impacts in both engineering and manufacturing contexts.
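A where-used query of the kind described above is essentially an inverted traversal of the BOM parent-child links. The sketch below is a generic illustration; the part numbers, link data, and function are invented for the example and do not reflect any actual Teamcenter API.

```python
from collections import defaultdict

# Illustrative parent-child BOM links spanning eBOM and plant mBOMs.
# All part and assembly numbers here are invented.
bom_links = [
    ("BIKE-EBOM", "FRAME-100"),
    ("BIKE-EBOM", "WHEEL-200"),
    ("BIKE-MBOM-PLANT-A", "FRAME-100"),
    ("BIKE-MBOM-PLANT-B", "FRAME-100"),
]

# Invert the links so we can ask "where is this part used?"
where_used = defaultdict(list)
for parent, child in bom_links:
    where_used[child].append(parent)

def impacted_assemblies(part, index):
    """Return every assembly that directly or indirectly uses `part`."""
    impacted, stack = set(), [part]
    while stack:
        for parent in index.get(stack.pop(), []):
            if parent not in impacted:
                impacted.add(parent)
                stack.append(parent)
    return impacted

# A change to FRAME-100 impacts the eBOM and the mBOMs of both plants.
print(sorted(impacted_assemblies("FRAME-100", where_used)))
```

Because the traversal follows indirect parents as well, a change to a deeply nested component still surfaces every affected product and plant context.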

 

Embracing a true PLM platform and solution is not an easy endeavor for many companies, even when they recognize the potential value and ROI offered by a rightly architected PLM solution. Success in any enterprise software implementation like PLM requires careful planning, dedicated resources, the right technical expertise, executive sponsorship, and a receptive culture, among other things. When done the right way, the results of such efforts are transformational, producing significant business benefits which can be measured and validated.

One of the biggest challenges to adopting PLM is organizational change management, given the breadth and scale of a true PLM solution. Many companies approach it in phases, and rightly so; but the key is how the phases are architected, tracked, and measured. PLM involves managing and linking data, processes, and people together as the product goes through its lifecycle from inception to design to manufacturing to support and, eventually, end of life. The first step is often managing data, specifically engineering CAD data. Most solutions start with a way to vault the CAD data along with some basic part numbering schemes and revision rules. Sometimes engineering documents are also vaulted along with the CAD data. Yes, data vaulted in a central repository brings a lot of benefits, such as elimination of duplicates, basic check-in/check-out and access controls, and added search capabilities, as opposed to data scattered across multiple locations. But the measured value of this alone may not substantiate the heavy PLM IT investment companies need to make for a truly scalable PLM platform. Sometimes there is an expectation misalignment between the full PLM value and just the data vaulting value; this at times sends companies into a long lull of "PLM assessment" after data vaulting. Sometimes cultural resistance or organizational change overturns any momentum. Maybe a technical glitch or integration shortfall previously overlooked becomes a deal breaker. An over-scoped and under-supported initiative can also run out of money or time.

Companies make a considerable upfront IT investment in the PLM platform so that they have a scalable solution for all phases, not just CAD vaulting. Most of the time they can add more capabilities and processes on the PLM platform without additional IT investment. So it is very important to get past the initial data vaulting phase and move on to the next phases to maximize the utilization of existing IT investments. The question then is where to go after CAD vaulting. This is where upfront PLM roadmap definition is so important, in terms of how the phases are architected, tracked, and measured. For companies that have successfully completed data vaulting but do not yet have a formal PLM roadmap defined, some of the next focus areas to consider are engineering process management, BOM management, change management, requirements management, and project and program management, in no specific order.

Anyone who has dealt with a Bill of Materials (BOM) knows about the challenges and complexities involved. Sometimes we get asked: managing a single BOM is cumbersome enough, so why do we even need another one in the form of a Manufacturing Bill of Materials (mBOM)?

What we have seen with our customers is that when there is only one BOM, it is usually owned by the engineering department (CAD BOM/eBOM) and available to the manufacturing department as read-only. This is not good enough for manufacturing teams, as they need to author and add data specific to manufacturing, for example manufacturing-specific consumables like glue and oil, or tooling fixtures. Another key factor is how the BOM is structured: an eBOM is typically structured around systems and functions and represents the product architecture, but the manufacturing team needs an mBOM organized according to the manufacturing assembly order.

As customers work toward the Industry 4.0 goal, they need smarter manufacturing solutions and systems that provide more ways to capture manufacturing business intelligence and then suggest solutions based on previous patterns. With this in mind, they need to invest in mBOM authoring and management. During mBOM adoption, the key is not to recreate the data that is already in the eBOM, but to reuse the eBOM and add information specific to manufacturing. That way there is both reuse and traceability of the data.
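The reuse-not-recreate principle can be illustrated with a minimal data model. Everything below (the classes, part numbers, and `source` attribute) is a hypothetical sketch: the mBOM references the same part objects as the eBOM rather than copying them, and only the manufacturing-specific additions are new.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Part:
    number: str
    description: str

@dataclass
class BomLine:
    part: Part
    quantity: int = 1
    source: str = "eBOM"  # provenance: reused from eBOM, or added for manufacturing

# Engineering BOM owns the designed parts (invented example data).
bolt = Part("P-001", "Bolt M8")
bracket = Part("P-002", "Bracket")
ebom = [BomLine(bolt, 4), BomLine(bracket, 1)]

# The mBOM reuses the same Part objects (traceability, no recreation) ...
mbom = [BomLine(line.part, line.quantity) for line in ebom]
# ... and adds manufacturing-specific consumables that never appear in the eBOM.
mbom.append(BomLine(Part("C-100", "Thread-locking glue"), 1, source="mBOM"))

# Every reused mBOM line traces back to an eBOM part.
reused = [line for line in mbom if line.source == "eBOM"]
assert all(line.part in {e.part for e in ebom} for line in reused)
```

Because the reused lines hold references to the identical part objects, a change to an engineering part is immediately visible from the mBOM side, which is what makes downstream change reconciliation tractable.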

At a high level, mBOM creation automation solutions exist in multiple flavors:

  1. Recipe-based mBOM: Customers can initiate mBOM creation via pre-configured templates pointing to the eBOM. Based on the recipe stored with the template, the engineering parts are automatically fetched into the mBOM. This kind of solution helps customers who have heavy standardization in their product offerings.
  2. Reusable manufacturing assembly: Customers can leverage the same manufacturing assembly across multiple product lines to reduce design, development, and procurement costs.
  3. New offline processing solutions: This approach tailors the mBOM creation process and application to the customer's needs using customization, standardizing and automating the process to capture business intelligence and enable its reuse.
  4. Smarter validations: Such solutions suggest what's next to business users, so users spend less time discovering problems and more time solving them.

The overall value of such solutions is not just the flexibility they offer the manufacturing team; they also reduce manufacturing process planning and execution lead time, with improved structure accuracy and a significant reduction in change reconciliation processing time.
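The recipe-based flavor in item 1 above can be sketched as a simple filter applied over the eBOM. The recipe format, part attributes, and plant name below are invented for illustration, not any vendor's actual template schema.

```python
# A "recipe" stored with an mBOM template: which eBOM part categories to
# fetch, and which plant the resulting mBOM is for. All names are hypothetical.
recipe = {"fetch_categories": {"structural", "fastener"}, "plant": "PLANT-A"}

ebom = [
    {"number": "P-010", "category": "structural"},
    {"number": "P-011", "category": "fastener"},
    {"number": "P-012", "category": "electrical"},  # not in this recipe's scope
]

def build_mbom(recipe, ebom):
    """Automatically fetch in-scope engineering parts into a new mBOM."""
    lines = [part for part in ebom if part["category"] in recipe["fetch_categories"]]
    return {"plant": recipe["plant"], "lines": lines}

mbom = build_mbom(recipe, ebom)
print([part["number"] for part in mbom["lines"]])  # P-010 and P-011 fetched; P-012 excluded
```

The same recipe can be replayed against any eBOM, which is why this flavor pays off most for customers with heavily standardized product offerings.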

For many companies, PLM starts as a CAD/design data vault, later evolving into a design data exchange platform. The most successful companies are taking PLM beyond just a design data exchange and access control platform, to a knowledge-driven decision support system. This means PLM not only needs to manage the multitude of information generated at various stages of the product lifecycle, but also to capture product development knowledge and feed it back into the lifecycle. For example, the requirements and design for a newer version of a product also need to be driven by the knowledge elements captured from the previous version's lifecycle, from inception to design to manufacturing and service.

When PLM stays just in the design engineering world, it is constrained in exchanging information and capturing knowledge from downstream stages managed by disconnected, siloed systems. This results in engineers spending a huge amount of time on data acquisition tasks. Industry studies show that information workers spend 30-40% of their time on information gathering and analysis alone, wasting time searching for nonexistent information, failing to find existing information, validating information, or recreating information that can't be found.

Quality escapes are another challenge with such disconnected systems, arising when the product does not conform to the engineering definition. Non-conformances found on the shop floor are costly to review and disposition, and even more severe when the product is already in service. Reconciling change is also extremely challenging, especially its downstream propagation, resulting in significant productivity losses. Slow change processing, along with quality escapes, causes delays in new product introduction, affecting companies' overall ability to compete.

The first step toward transforming PLM into a true knowledge-driven decision support system is to extend it to the CAD/CAM/CNC process chain, thus taking it to the shop floor. Such a solution helps establish a continuous loop from engineering to the shop floor for operations management and manufacturing execution systems (MES). A continuous-loop system provides more ways to capture business intelligence and then suggest solutions based on previous patterns. It is then much easier to capture information and use analytics to synthesize valuable knowledge elements, compared to the fragmented solutions many companies have today. It is also a foundational element for establishing a Digital Twin per the Industry 4.0 vision.

 

Other key benefits of extending PLM to manufacturing include:

Reducing the time to market

  • Enhanced collaboration between Product and Manufacturing Engineering
  • Enhanced Traceability and Faster Change Management

Enhancing Flexibility

  • Manufacturing plans comprehend product variability/complexity
  • “What if” scenarios for optimized decision making

Increasing Quality

  • Manufacturing Simulation and validation integrated in PLM
  • Up-to-date 3D work instructions delivered to the shop floor

Increasing Efficiency

  • Ongoing process optimization based on closed-loop feedback of utilization data
  • Reuse of common methods/tooling

You have probably heard about 3D EXPERIENCE. Or you may have heard about “The Platform”. But what does it actually do? How is it made? This article will explain, from the very beginning, what the 3D EXPERIENCE Platform is all about.

The Platform is created by Dassault Systemes and is their central product for the future. Here is a quote from the 3DS website:

“Dassault Systemes, the 3DEXPERIENCE Company, provides business and people with virtual universes to imagine sustainable innovations”

3D EXPERIENCE Platform is primarily a Product Lifecycle Management (PLM) system and is aimed at supporting the digital design and development of products that subsequently get manufactured. This explains the references to “virtual” and “innovations” in the quote. As a platform, it houses multiple capabilities (apps) in a single seamless piece of software. A user logs into one interface and is able to access all the installed and licensed capabilities of their 3D EXPERIENCE from there. The platform is based on a database, allowing for storage and indexing of data.

Dassault Systemes explain their platform in terms of a 3D compass illustrated below:

 

Let us look at the four quadrants, starting at the top and working around in a clockwise direction:

  1. Collaboration apps include functionality that fosters informal collaboration across extended teams (SWYM) and structured collaboration such as formal change management (ENOVIA).
  2. Information intelligence apps are designed to handle Big Data and, drawing from multiple sources, present the user with concise summaries of the information they need (NETVIBES).
  3. Simulation apps are aimed at virtual digital validation and testing of designs, traditionally known as CAE or FEA (SIMULIA). They can also be extended to virtual simulation of factories or retail store layouts.
  4. 3D Modeling apps are perhaps what Dassault Systemes is best known for and include CATIA and SolidWorks.

Finally, because all this functionality is in a single platform, this allows the user a realistic and immediate experience in a virtual world (Real time Experience).

A few more pertinent points regarding the 3D EXPERIENCE Platform:

  1. The platform is scalable to suit the requirements of each organization using the technology. Adopters can choose to start with basic functionality and then move to more advanced capabilities.
  2. The concept of a design platform grew out of the necessity to store and maintain all the digital data generated during product development after widespread adoption of CAD applications. It now has extensive capabilities beyond that.
  3. Dassault Systemes are constantly adding new apps and functions to the platform, so the range of capability is expanding.
  4. More and more of the platform is moving to the cloud. Dassault Systemes now offer a complete SaaS model for many of the apps within the platform, which dramatically reduces implementation complexity.

Obviously, each application included in the platform is comprehensive enough to warrant a complete subject in and of itself, but I hope this breakdown has given you a useful high-level overview of what 3D EXPERIENCE can do for an organization, not just at initial implementation but in terms of adapting to its changing needs over time.

 

 

Robots, automation, and what some would even call “artificial intelligence” are everywhere around us today. From factories to automobiles, from our smartphones to our refrigerators, digital technology has had a profound impact on life over recent decades. This doesn't appear to be slowing down anytime soon, either.

Automation and robots in factories have replaced many assembly line jobs, and this has also led to great improvements in quality and efficiency. Instead of assembly line workers, we now have robot technicians and programmers.

The big question is “Why?” And the answer is generally along the lines of improved efficiency, driven by the need to stay competitive.

It could be argued that we are seeing the same thing in the engineering environment with design automation and the associated engineering intelligence built into repeatable design types… or automated workflows that are electronically managed to ensure process repeatability, quality and efficiency.

This begs the question: can companies survive in today's market without investing in their business and engineering processes as they have done, or must do, in manufacturing?

There was a day when it was unlikely that a company would buy a 3D CAD system without extensively evaluating it.  They required demos, trials, benchmarks, pilot projects and extensive financial ROI analysis.  Are those days gone?  Early in my career, I made a living by simply being able to demonstrate relatively new 3D CAD technology.  These days, a demo is rarely required for purchases of 3D CAD.  Decisions about a company’s core 3D CAD package have generally been previously made, or are now based on data formats of customers or suppliers.

It seems that 3D CAD is simply now an expected part of product development processes and an integral part of PLM in general.  The specific version of 3D CAD doesn’t seem to be nearly as critical as companies previously expected them to be.  Most can now get the job done in small to mid-size companies, with minor differences depending on the specific situation.

There does still seem to be a “pecking order” for the various CAD systems in the manufacturing sector.  The large companies with the broadest set of requirements (and the deepest pockets) generally define the standard.  This includes the Automotive and Aerospace OEMs as an example.  Once they settle on a primary CAD system, many other suppliers base their CAD requirements upon the OEM's decision.  This doesn't automatically mean the suppliers choose the same CAD system; just that the supplier needs to be able to communicate and exchange data with the OEM in an efficient manner.  Oftentimes, an automotive supplier will obtain a license or two of the OEM's chosen CAD software, but it will not be deployed across their entire environment.  The “Top-Tier” CAD that the OEM decided upon may only be used to translate and communicate directly with the OEM, while the bulk of their CAD users might be using a “Mid-Tier” CAD system that is perfectly capable of meeting the supplier's design requirements.  A host of emerging cloud-based CAD technology is also available.

 

So what does this mean to the industry?  Focus on the next thing.  Maybe that is a fully electronic PLM environment, or updated NC or additive manufacturing software.  It could be the adoption of up-front simulation technology to accelerate the design cycle.  There are a lot of things from a technology continuity perspective that can still be addressed once the CAD platform has been settled upon.  Just don’t lose sight of other opportunities for continuous improvement once your CAD house is in order.

Typically, when new software releases come out, there are always a few key improvements that really stand out.  Many times it is a cool new modeling feature, or maybe an entirely new approach to design; in Inventor, examples include the addition of Freeform Modeling or Direct Editing.  Unfortunately, these are features or techniques that might not be applicable to many users.

If you are using both Autodesk Inventor and Vault together, however, you should probably pay attention to this one: the Vault status icons in the “recently used” area.  These icons now clearly identify the current Vault status of a recent file in the Inventor “Open” dialog box.  Is the file checked out?  Is the file checked in and up to date in my workspace?  Has someone else modified the file since I last worked on it?  Have I checked in my latest development ideas or new parts yet?  All of these can be determined simply by noticing the Vault status bubbles in the “Open” dialog box.

Vault Status Icons

According to a PLM Foresight Opinion Poll conducted by CIMData, 70% of Senior Executives in manufacturing companies saw little or no value in PLM. This is a troubling statistic as it shows that PLM is not as widely adopted and embraced as it should be. PLM can bring huge efficiency gains to an organization and prevent a lot of errors.

How can you get efficiency from PLM?  One approach is to use a maturity assessment. These models investigate issues related to best practices of the system being evaluated, using defined “pillars” of major functionality. When the maturity of a given pillar is evaluated and measured, it provides current and future capability levels and can be used to identify potential improvements and goals for the system being evaluated. When a maturity model is applied to and shared by a number of organizations in a particular industry, it can provide an industry-specific benchmark against which to evaluate an organization's maturity with respect to others in the same industry.
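At its core, this kind of assessment reduces to a gap calculation per pillar: measure the current level, agree on a target level, and prioritize the largest gaps. The sketch below illustrates that arithmetic only; the pillar names, the 1-5 scale, and the scores are invented and do not represent CIMData's or Tata Technologies' actual models.

```python
# Current vs. target maturity level (illustrative 1-5 scale) per pillar.
# All pillar names and scores here are invented example data.
assessment = {
    "BOM Management":    {"current": 2, "target": 4},
    "Change Management": {"current": 1, "target": 4},
    "CAD Data Vaulting": {"current": 4, "target": 5},
}

def prioritized_gaps(assessment):
    """Rank pillars by maturity gap, largest first, to focus improvement effort."""
    gaps = {pillar: s["target"] - s["current"] for pillar, s in assessment.items()}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

for pillar, gap in prioritized_gaps(assessment):
    print(f"{pillar}: gap of {gap} maturity level(s)")
```

In this invented example, Change Management surfaces first with the largest gap, which is exactly the kind of prioritization signal a roadmap exercise is meant to produce.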

So why should a company assess the maturity of its PLM?  This assessment can guide companies to a PLM roadmap that will enable them to improve the state of their current system. The roadmap will allow them to deploy appropriate technologies, processes, and organizational changes that enhance the overall product development process from early concept through product end of life. This in turn leads to improved bottom-line and top-line benefits.

Tata Technologies have developed PLM Analytics as a framework to provide customers with information about the state of PLM within their enterprise and the maturity (ability to adopt and deploy) of their PLM, leading ultimately to the creation of a detailed PLM roadmap that will support their business strategy and objectives. Each component builds on and complements the others, but can be conducted independently.

What is PLM Analytics?  A high level diagram is shown below:

PLM Benchmark: Helps triage your PLM needs and find out how you stack up against the competition. The resulting report benchmarks performance against 17 industry-standard “pillars” and evaluates the current state and desired future state, with an indication of implementation priority. The Benchmark is a consultant-led personal interview with one or more key business leaders. It is toolset agnostic.

PLM Impact: Builds a business case and demonstrates how PLM can save you money. Once a PLM Benchmark has been completed, a consultant-led series of interviews with multiple key business leaders can identify savings opportunities. These opportunities are summarized as financial metrics, payback, and ROI, which can be used to validate proposed PLM initiatives and provide decision makers with key financial data.

PLM Healthcheck: Helps you understand how your current PLM works from the people who use it. The Healthcheck surveys a cross-section of your product design team via online assessments to establish PLM health related to organization, processes, and technology. The results identify gaps against best practices, assess consistency of organizational performance, and prioritize areas of improvement. The Healthcheck can be used on a global basis.

PLM Roadmap: A 360° view of your PLM plus a detailed roadmap for pragmatic implementation. The roadmap is constructed from onsite interviews with senior leaders, middle management, and end users across product development. These sessions focus on the specific business processes and technologies to be improved and result in a PLM Roadmap: an actionable improvement plan with defined activities and internal owners to ensure successful implementation.

By using this suite of tools, Tata Technologies can put you on the road to PLM success!

© Tata Technologies 2009-2015. All rights reserved.