Category "PLM Expert Insights"

According to a PLM Foresight Opinion Poll conducted by CIMdata, 70% of senior executives in manufacturing companies saw little or no value in PLM. This is a troubling statistic: it shows that PLM is not as widely adopted and embraced as it should be. PLM can bring huge efficiency gains to an organization and prevent many errors.

How can you get efficiency from PLM? One approach is to use a Maturity Assessment. These models investigate issues related to best practices of the system being evaluated, using defined “pillars” of major functionality. When the maturity of a given “pillar” is evaluated and measured, this provides current and future capability levels and can be used to identify potential improvements and goals for the system being evaluated. When a maturity model is applied to and shared by a number of organizations in a particular industry, it can provide an industry-specific benchmark against which to evaluate an organization’s maturity with respect to others in the same industry.
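
An illustrative sketch of how such an assessment can be summarized: score each pillar’s current and target maturity, then rank the gaps to prioritize improvements. The pillar names and the 1-5 scale below are hypothetical examples, not the criteria of any particular assessment.

    # Hypothetical maturity scoring: rank "pillars" by the gap between
    # current and target capability levels (1-5 scale, values made up here).
    pillars = {
        "Requirements Management": (2, 4),
        "Change Management": (3, 4),
        "BOM Management": (1, 4),
        "Supplier Collaboration": (2, 3),
    }

    def prioritize(pillars):
        """Return pillars sorted by largest maturity gap first."""
        gaps = {name: target - current for name, (current, target) in pillars.items()}
        return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

    for name, gap in prioritize(pillars):
        print(f"{name}: gap of {gap} maturity level(s)")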

So why should a company assess the maturity of their PLM? This assessment can guide companies to a PLM roadmap that will enable them to improve the state of their current system. The roadmap will allow them to deploy appropriate technologies, processes, and organizational changes that enhance the overall product development process from early concept through product end of life. This in turn leads to improved bottom-line and top-line benefits.

Tata Technologies has developed PLM Analytics as a framework to provide customers with information about the state of PLM within their enterprise and the maturity (ability to adopt and deploy) of PLM, and ultimately the creation of a detailed PLM roadmap that will support their business strategy and objectives. Each component builds on and complements the others, but can be conducted independently.

What is PLM Analytics? A high-level diagram is shown below:

PLM Benchmark – Helps triage your PLM needs and shows how you stack up against the competition. The resulting report benchmarks performance against 17 industry-standard “pillars” and evaluates the current state and desired future state, with an indication of implementation priority. The Benchmark is a consultant-led personal interview with one or more key business leaders. It is toolset agnostic.

PLM Impact – Builds a business case and demonstrates how PLM can save you money. Once a PLM Benchmark has been completed, a consultant-led series of interviews with multiple key business leaders can identify savings opportunities. These opportunities are summarized as financial metrics such as payback and ROI, which can be used to validate proposed PLM initiatives and provide decision makers with key financial data.

PLM Healthcheck – Understand how your current PLM works from the people who use it. The healthcheck surveys a cross-section of your product design team via online assessments to establish PLM health related to organization, processes, and technology. The results identify gaps against best practices, assess the consistency of organizational performance, and prioritize areas for improvement. The healthcheck can be used on a global basis.

PLM Roadmap – A 360° view of your PLM plus a detailed roadmap for pragmatic implementation. The roadmap is constructed from onsite interviews with senior leaders, middle management, and end users across product development. These sessions focus on the specific business processes and technologies to be improved and result in a PLM Roadmap: an actionable improvement plan with defined activities and internal owners to ensure successful implementation.

By using this suite of tools, Tata Technologies can put you on the road to PLM success!

You can’t get away from it: IoT, the Internet of Things. Connected devices everywhere. It is estimated that there will be over 50 billion connected devices by 2020. At this rate, our toasters will be connected to the internet! From your smartphone, you could control the heat level, monitor the temperature, look for cool spots, and watch the carbohydrates carbonize in real time!

While we may not actually see the connected toaster (at least I hope not), many companies are looking into their own IoT strategies.

Manufacturers are no different. Industrial IoT (IIoT) is helping manufacturers learn more about their own products. All this information is helping to create brand new business models, selling outcomes instead of products. GE is probably the most notable example of this, where they sell flight hours instead of jet engines. With the immense amount of data generated by sensors on these engines, GE is able to predict failures and analyze performance to fine-tune operation.

The question now is not whether a company can utilize this type of operational feedback, but when. Manufacturers will have to understand how that data fits into their systems engineering. As of now, this would require a company to apply the digital thread and digital twin philosophy. In short, the digital twin first accounts for the PLM-ish information leading up to delivery (requirements, simulations, designs, manufacturing processes, etc.). Then, each physical product has an instance of the digital twin to record what happens after delivery. This includes as-built conditions, service modifications, and, here is the IoT connection, operational parameters.
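
To make the idea concrete, here is a minimal sketch of what a per-unit digital twin record might look like. The class, fields, and values are purely illustrative assumptions, not a standard schema or any vendor’s data model.

    # Hypothetical data model for "one digital twin per physical product".
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class DigitalTwin:
        serial_number: str       # one twin instance per delivered unit
        design_revision: str     # link back to the digital thread (requirements, designs, etc.)
        as_built: dict = field(default_factory=dict)         # deviations recorded at delivery
        service_history: list = field(default_factory=list)  # modifications after delivery
        telemetry: list = field(default_factory=list)        # IoT operational parameters

        def record_reading(self, sensor: str, value: float) -> None:
            """Append an operational data point streamed from the product's sensors."""
            self.telemetry.append((datetime.now(timezone.utc), sensor, value))

    engine = DigitalTwin(serial_number="SN-0042", design_revision="B")
    engine.record_reading("exhaust_gas_temp_C", 612.5)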

The market may not yet be ready for this level of digitalization (a term I first heard here, at least in this context). So many companies are still using shared drives and spreadsheets as their PLM tools of choice. What we discovered is that if a company wants to have an IoT strategy, they have to understand what that entails. Will the company be ready to change its culture fast enough? IIoT is a great feedback loop; however, there has to be something to feed all that data back into. The promise is there; I think people understand the benefits.

To answer the original question: I think PLM may be ready for it, as it matures and incorporates systems engineering, simulation-in-the-loop, and other enabling technologies. However, I don’t think the market is ready for the marriage of IIoT and PLM.

In a world of controlled, strict, inflexible systems, people start to get creative. For some, it’s the crushing weight of a massively customized ERP system that has somehow spread to every part of the organization; for others, it’s circumventing “inconvenient” safety devices. When things need to get done, sometimes we have to take matters into our own hands. In the IT world, this is called “Shadow IT,” which is basically any app, software, or program that isn’t under the control of (or even known to) the IT department: users downloading freeware, buying their own software, using cloud services, etc. Even NASA has difficulty reining in its employees when it comes to using non-sanctioned software.

This behavior extends into the design and engineering office as well, perhaps more so than into other parts of an average organization. It’s in engineers’ nature to solve problems; it’s practically the key attribute of an engineering job description! I can understand why: engineers live and breathe efficiency, and being over-encumbered by poorly designed systems is not efficient at all.

Case in point: the Bill of Materials (BOM). How many systems are controlling it? ERP? The design tool? PDM?  Some home-grown custom system?  By and large…no.  Most work-in-process BOMs are done in spreadsheets.  And why not?  Spreadsheet applications are easy to use and share, don’t require much training, and don’t require six levels of approval to implement. Spreadsheets can even be automated with macros, making BOMs configurable and intelligent. Eventually all items do end up in the ERP, though typically not until late in a product’s/project’s/program’s lifecycle.

So, why not stay with the spreadsheets? What if someone edits those macros and an unknowing user doesn’t verify the end result? What if the file gets corrupted? How does the rest of the organization gain visibility into the latest changes? What ensures that the BOM is up to date with the design? Ultimately, the BOM should be controlled in a PLM system, much to the chagrin of clever engineers everywhere. Here’s why.

Consider what happened when the market moved from pencil drawings, French curves, vellums, and other manual drafting techniques to 2D CAD tools, and again from 2D design to 3D modeling: change. Yes, sketching a part is faster than creating the lines and circles, but CAD technology propagates updates caused by change much faster than a manual process. The gains of going from 2D designs to 3D models are even more staggering. “You mean one edit and it updates everywhere? Even the drawing?” Anecdotally, I had a customer say, “With 2D, whenever the shop called, I was worried about what we messed up. Now, with 3D models, when the shop calls, I worry about what they messed up.”

Again, it’s about the rate of change, and the propagation and validation of that change. Spreadsheets cannot do that (unless you have some really wicked cool macros). A rough sketch of the kind of check a controlled BOM makes routine is shown below.
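
In the sketch, the part numbers and revisions are made up; the point is simply that a controlled system keeps this comparison automatic, while a spreadsheet relies on someone remembering to do it.

    # Hypothetical example: validate a spreadsheet-style BOM against the
    # released design data. Part numbers and revisions are invented.
    design_items = {       # what the CAD/PDM system says is released
        "100-001": "C",
        "100-002": "B",
        "100-003": "A",
    }

    wip_bom = {            # what the work-in-process spreadsheet BOM says
        "100-001": "B",    # stale revision
        "100-002": "B",
        "100-004": "A",    # item that was never released
    }

    for part, rev in wip_bom.items():
        released = design_items.get(part)
        if released is None:
            print(f"{part}: not found in released design data")
        elif released != rev:
            print(f"{part}: BOM shows rev {rev}, design is at rev {released}")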

With our PLM Analytics Benchmark, we can help your company assess the nature of your BOM needs, as well as the 16 pillars of PLM. Let us know if we can be of service!

Product development companies need to manage a wide variety of documents in different formats and types as they design, manufacture, and support their products. Gone are the days when paper documents ran businesses. Today everything is digital, but very often the digital documents related to products and product development are created in siloed environments, disconnected from product development processes. Document authors often recreate or re-enter information from product development into their documents.

If the document authors don’t have visibility into the latest product changes, documents become out of sync with product updates. This impacts critical business processes due to inaccuracies or lack of current data. For organizations working globally, another challenge is the high cost and time involved in building complex documents that have multiple language/regional and regulatory requirements.

Teamcenter addresses this challenge by enabling documents that relate to and support product development to be stored alongside product data and processes. When documents are managed in the context of product data related to parts, or to other documents, companies have a single version control, access control and process control system for the entire enterprise, including product data and documents.

Source material from product data can be accessed and used to create documents like parts catalogs, work instructions, service material, specifications for suppliers, trade studies, or even regulatory filings. The documents can then be delivered as appropriate to the end user in the required format, whether as a PDF or HTML web page or an interactive web tool, or exchanged with customers or suppliers using an industry standard.

The Teamcenter document management solution is focused on improving the document quality while streamlining the process of document creation and delivery. One of the central themes to this is “Transparent PLM.”

In a transparent PLM approach, users continue to do all their document work in their existing document authoring tools, like the Microsoft Office product suite. They can also perform PLM activities – including review, approval, and version or effectivity tracking – directly from the same Office products. With users continuing to work with document tools in which they are already proficient, they become more productive, and the learning curve involved with a new PLM tool is eliminated. This helps with easy user adoption of the solution without any formal training requirements. […]

There is an excellent story in leadership consulting lore. I’m not sure how true it is, but the lessons derived from it are incredibly valuable.

There was once a detachment of Hungarian soldiers that struck out on a reconnaissance mission from their platoon in the Alps. While they were out, there was a massive snowstorm and the soldiers lost their way – returning was impossible. The team was worried; they were not prepared for an extended stay in these harsh conditions, and even if they had been, how would they get back with no knowledge of their location? They had all but given up hope when one soldier, while rummaging through his uniform, found a map. He showed it to the group and a newfound sense of hope came over them. They rallied together, found shelter, and waited out the storm.

After a couple of days, the blizzard finally let up. Wearily, the soldiers set about returning to their platoon. Using the map, they identified various features of the land, and made their way back. Their commander was elated to see them alive and well. When he asked the team how they did it, the soldier showed the commander the map that had not only guided them back, but had also given them the hope to persevere.  Confused, the commander asked this soldier, “How on earth did you find your way using a map of the Pyrenees?”

This story teaches us many things; here are two:

  • Fear and anxiety can lead people to inaction, even to their own detriment (and the effect usually intensifies in groups)
  • Even with the wrong strategy or plan, the chances of success are higher than if there were no plan at all

The second point has many applications in the business world. The one I think of most, in terms of our manufacturing customers, is their shop floors. Often manufacturers, especially small and medium-sized ones, don’t have a chance to get deep into process planning. Stations are haphazardly placed, too many or too few activities are scheduled at stations, new machinery is placed wherever it fits, and so on. All of this causes bottlenecks and slows down getting product out the door. As we all know, time is money – especially in manufacturing, where every lost minute, hour, or day translates into lost revenue.

Tata Technologies has an amazing team of technical experts and works with many solution providers that can help manufacturers find their own map. One of the maturity benchmarks we offer is for the “Digital Factory”; contact us to schedule yours.

 

This post was originally written in January of 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post will address a common type of 3D printing known as Electron Beam Freeform Fabrication (EBF³).

What is Electron Beam Freeform Fabrication?

It is actually part of a broader category commonly referred to as Filament Extrusion Techniques. Filament extrusion techniques all utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded as a 2D cross-section onto a platform. The platform is lowered and the process is repeated until a part is completed. In most commercial machines, and in higher-end consumer-grade machines, the build area is typically kept at an elevated temperature to prevent part defects. The most common, and the first, technology of this type to be developed is Fused Deposition Modeling.

The Fused Deposition Modeling Technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd. in the late 1980s. The technology was then patented in 1989. The patent for FDM expired in the early 2000s. This helped to give rise to the Maker movement by allowing other companies to commercialize the technology.

Electron Beam Freeform Fabrication, or EBF³ is one of the newest forms of rapid prototyping. This technique is performed with a focused electron beam and a metal wire or filament. The wire is fed through the electron beam to create a molten pool of metal. The material solidifies instantaneously once the electron beam passes through, and is able to support itself (meaning support structures generally aren’t required). This entire process must be executed under a high vacuum.

Pioneered by NASA Langley Research Center, this process is capable of producing incredibly accurate parts at full density, a result that other additive manufacturing techniques have trouble achieving or require secondary operations to match. This is also one of the only techniques that can be successfully performed in zero-gravity environments.

What Are the Advantages of this Process? […]

Are you faced with a complex data migration or translation? Do you have years of legacy data that needs to be migrated to a new system? Do you have old CAD data from an outdated system that is still being used?

If you have answered yes to any of these questions, you are facing the prospect of performing a migration or translation project. Here are 10 potential problems that you must look out for before starting:

  1.  Underestimation of effort – too many projects are underestimated, primarily because the use cases for the translation are thought to be simpler than they actually are. For example, assemblies only need translation until someone remembers that drawings need to be included.
  2.  “Everything” syndrome – Looking at a project, most organizations default to attempting to translate or migrate everything. This is almost never necessary, as only a subset of the data is really relevant. Making this mistake can drive up both cost and complexity dramatically.
  3.  Duplicate data – Of everything that needs to be moved, how much of it is duplicate data (or the same data in slightly different forms)? Experience shows that duplicate data percentages can be as high as 20 to 30%. Unfortunately, identifying these duplicates can be difficult, but there are techniques to overcome this problem (a simple example follows this list).
  4.  Accuracy of CAD translation – When looking at 3D CAD translations, how accurate do the translated models need to be relative to the originals? Again, a blanket requirement of “identical” can drive up cost and complexity hugely. A lesser target (say ±2 mm) can improve success.
  5.  Data already exists in Target – Some level of informal manual migration may have already occurred. So, when a formal migration is performed, data “clashes” can occur and result in failures or troublesome duplicates.
  6.  Automatic is not always best – Developing an automated migration or translation tool can be costly if the requirements are numerous. Sometimes a manual approach is more cost-effective for smaller and simpler cases.
  7.  Data Enrichment – Because the source data was created in an older system, it may not have all the properties and data that the target system requires. In this case, these have to be added during the migration or translation process. Forgetting about this step will prevent users from accurately finding data later.
  8.  Loss of Data – For large data volumes, is it possible that some of the data is missed or lost during the project? Very possible – preventing this requires exhaustive testing and planning.
  9.  Archive Solution – Once the translation or migration is complete, what happens to the original data? In some cases it is possible to delete it. However, in some environments (e.g. regulatory situations) this may not be allowed. In such a case, has an archive solution been put in place?
  10.  Security – Legacy data may be subject to security (ITAR, competitive data, etc.). Does the migration or translation process expose sensitive information to unauthorized users? Often a process will take the data out of its protected environment. This problem has to be considered and managed.
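
As promised in item 3, here is one simple technique for finding exact duplicates: hash file contents so that byte-identical files are grouped regardless of name or folder. The directory path below is a placeholder, and near-duplicates (the same data in slightly different forms) still need smarter comparison than this sketch provides.

    # Group byte-identical files by a content hash; only exact copies are caught.
    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def find_exact_duplicates(root):
        groups = defaultdict(list)
        for path in Path(root).rglob("*"):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                groups[digest].append(path)
        return [paths for paths in groups.values() if len(paths) > 1]

    for duplicate_set in find_exact_duplicates("/data/legacy_cad"):   # placeholder path
        print("Duplicate set:", [str(p) for p in duplicate_set])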

Ask these questions before translations and migrations begin!

This post was originally created in January 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Electron Beam Melting (EBM).

What is Electron Beam Melting?

It is actually part of a broader category commonly referred to as Granular-Based Techniques. All granular-based additive manufacturing techniques start with a bed of powdered material. A laser beam or bonding agent joins the material in a cross-section of the part. Then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross-section. The process is then repeated until a complete part is produced. The first commercialized technique of this category is known as Selective Laser Sintering.

The Selective Laser Sintering technique was developed in the mid-1980s by Dr. Carl Deckard and Dr. Joseph Beaman at the University of Texas at Austin, under DARPA sponsorship. As a result, Deckard and Beaman established the DTM Corporation with the explicit purpose of manufacturing SLS machines, and in 2001 DTM was purchased by its largest competitor, 3D Systems.

Electron Beam Melting is very similar to Selective Laser Melting, though there are a few distinct differences. EBM uses an electron beam to create a molten pool of material that forms each cross-section of the part. The material solidifies instantaneously once the electron beam passes. In addition, this technique must be performed in a vacuum. This is one of the few additive manufacturing techniques that can create full-density parts.

What Are the Advantages of this Process?

EBM is quick; it’s one of the fastest rapid prototyping techniques (though, relatively speaking, most techniques are fast). In addition, it can potentially be one of the most accurate rapid prototyping processes, the major limiting factor being the particle size of the powdered material.

As mentioned previously, this is one of the only additive manufacturing techniques that yields full-density parts; this means parts created with EBM will have similar properties to parts created using traditional manufacturing processes.

Another advantage of the material bed is the ability to stack multiple parts into the build envelope. This can greatly increase the throughput of an EBM machine.

What Are the Disadvantages of this Process? […]

This post was originally created in January 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Selective Laser Melting (SLM).

What is Selective Laser Melting?

It is actually part of a broader category commonly referred to as Granular-Based Techniques. All granular-based additive manufacturing techniques start with a bed of powdered material. A laser beam or bonding agent joins the material in a cross-section of the part. Then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross-section. The process is then repeated until a complete part is produced. The first commercialized technique of this category is known as Selective Laser Sintering.

The Selective Laser Sintering technique was developed in the mid-1980s by Dr. Carl Deckard and Dr. Joseph Beaman at the University of Texas at Austin, under DARPA sponsorship. As a result, Deckard and Beaman established the DTM Corporation with the explicit purpose of manufacturing SLS machines; in 2001, DTM was purchased by its largest competitor, 3D Systems.

SLM is a similar process to SLS, though there are some important differences. Instead of the substrate being sintered, it is melted to fuse layers together. This is typically done in a chamber filled with an inert gas (usually nitrogen or argon) and incredibly low levels of oxygen (below 500 parts per million), to prevent unwanted chemical reactions when the material changes its physical state. This technique yields higher-density parts than any sintering process.

What Are the Advantages of this Process?

SLM is quick; it is one of the fastest rapid prototyping techniques (though, relatively speaking, most techniques are fast). In addition, it can potentially be one of the most accurate rapid prototyping processes, the major limiting factor being the particle size of the powdered material.

As mentioned previously, this technique yields higher density parts than other additive manufacturing techniques, making for a much stronger part.

Another advantage of the material bed is the ability to stack multiple parts into the build envelope. This can greatly increase the throughput of an SLM machine.

What Are the Disadvantages of this Process? […]

Read Part 1 here.

So, what does a structured process to data migration and translation look like?

First a few definitions:

  • Source system – the origin of the data that needs to be translated or migrated. This could be a database or a directory structure.
  • Target system – the final destination for the data. On completion of the process, data in the target should be in the correct format.
  • Verify – Ensure that data placed in the target system is complete, accurate, and meets defined standards.
  • Staging area – an interim location where data is transformed, cleaned, or converted before being sent to the target.

The process consists of five steps as shown below:

[Diagram: the five-step migration process]

The process can be described as follows:

  • Data to be migrated is identified in the source system. This is an important step and ensures that only relevant data is moved. Junk data is left behind.
  • The identified data is extracted from the source system and placed in the staging area.
  • The data is then transformed into a format ready for the target system. Such a transformation could be a CAD-to-CAD translation, a metadata change, or a cleaning process. Transformation may also entail data enrichment – for example, appending additional properties to the objects so they can be found more easily in the target system.
  • Transformed data is then loaded into the target system. This can be done automatically via programs or manually, depending on the chosen method. Automated routines can fail, and any failures are flagged for analysis and action.
  • Once data is loaded, validation is carried out to ensure that the migrated data is correct in the target system and not corrupted in some fashion.
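
A bare-bones sketch of those five steps in code, assuming the simplest possible case: records are dictionaries, the staging area and target are plain lists, and a "Released" status marks the relevant data. Real extractors, transformers, and loaders would wrap the source and target systems' own APIs.

    # Skeleton of the identify -> extract -> transform -> load -> validate flow.
    def identify(source_records):
        """Keep only relevant data; junk data is left behind."""
        return [r for r in source_records if r.get("status") == "Released"]

    def extract(records, staging):
        staging.extend(records)

    def transform(staging):
        """CAD translation, metadata mapping, or enrichment would happen here."""
        return [{**r, "migrated_from": "legacy_system"} for r in staging]

    def load(records, target):
        failures = []
        for record in records:
            try:
                target.append(record)      # stand-in for a loader API call
            except Exception:
                failures.append(record)    # flagged for analysis and action
        return failures

    def validate(records, target):
        """Confirm every transformed record arrived intact in the target."""
        return all(record in target for record in records)

    source = [{"id": 1, "status": "Released"}, {"id": 2, "status": "WIP"}]
    staging, target = [], []
    extract(identify(source), staging)
    transformed = transform(staging)
    failed = load(transformed, target)
    print("Failures:", failed, "| Validation passed:", validate(transformed, target))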

The process described above is shown below at the working level:

[Diagram: working-level view of the migration process, showing the extractor and loader tools]

Shown in this diagram are two software tools – extractors and loaders. These are usually custom utilities that use APIs, or hooks into the source and target systems, to move the identified data. For example, an extractor tool may query a source PLM system for all released and frozen data that was released after a given date. Once this search is complete, the data identified by this will be downloaded by the extractor from the PLM system into the staging area.

In a similar manner, a loader will execute against a correct data set in the staging area and insert this into a target system, creating the required objects and adding the files.
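
A hedged sketch of what such an extractor might look like follows. The plm_client object and its query and download calls are hypothetical stand-ins for whatever API the source PLM system actually exposes; they are not a real product's interface.

    # Hypothetical extractor: pull released items changed after a cutoff date
    # from the source PLM system into the staging area.
    from datetime import date

    def extract_released_items(plm_client, released_after, staging_dir):
        """Query the source system for frozen/released data and download it to staging."""
        items = plm_client.query(status="Released", released_after=released_after)
        for item in items:
            plm_client.download(item, destination=staging_dir)
        return len(items)

    # Usage (assuming some client object for the source system):
    # count = extract_released_items(client, date(2015, 1, 1), "/staging/batch_01")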

It is highly recommended that pilot migrations be carried out on test data in development environments to verify the process. This testing will identify potential bugs and allow them to be fixed before actual data is touched.

Such a structured process will guarantee success!
