
Product development companies must manage a wide variety of documents in different formats and types as they design, manufacture, and support their products. Gone are the days when paper documents ran businesses. Today everything is digital, yet these digital documents related to product and product development are often created in siloed environments disconnected from product development processes, and document authors frequently recreate or re-enter product development information into their documents.

If the document authors don’t have visibility into the latest product changes, documents become out of sync with product updates. This impacts critical business processes due to inaccuracies or lack of current data. For organizations working globally, another challenge is the high cost and time involved in building complex documents that have multiple language/regional and regulatory requirements.

Teamcenter addresses this challenge by enabling documents that relate to and support product development to be stored alongside product data and processes. When documents are managed in the context of product data related to parts, or to other documents, companies have a single version control, access control and process control system for the entire enterprise, including product data and documents.

Source material from product data can be accessed and used to create documents like parts catalogs, work instructions, service material, specifications for suppliers, trade studies, or even regulatory filings. The documents can then be delivered to the end user in the required format (a PDF, an HTML web page, or an interactive web tool) or exchanged with customers and suppliers using an industry standard.

The Teamcenter document management solution is focused on improving the document quality while streamlining the process of document creation and delivery. One of the central themes to this is “Transparent PLM.”

In a transparent PLM approach, users continue to do all their document work in their existing document authoring tools, like the Microsoft Office product suite. They can also perform PLM activities – including review, approval, and version or effectivity tracking – directly from those same Office products. Because users keep working with document tools in which they are already proficient, they become more productive and the learning curve of a new PLM tool is eliminated. This eases user adoption of the solution without any formal training requirements. […]

There is an excellent story in leadership consulting lore. I’m not sure how true it is, but the lessons derived from it are incredibly valuable.

There was once a detachment of Hungarian soldiers that struck out on a reconnaissance mission from their platoon in the Alps. While they were out, a massive snowstorm hit and the soldiers lost their way – returning was impossible. The team was worried; they were not prepared for an extended stay in such harsh conditions, and even if they had been, how would they get back with no knowledge of their location? They had all but given up hope when one soldier, rummaging through his uniform, found a map. He showed it to the group and a newfound sense of hope came over them. They rallied together, found shelter, and waited out the storm.

After a couple of days, the blizzard finally let up. Wearily, the soldiers set about returning to their platoon. Using the map, they identified various features of the land, and made their way back. Their commander was elated to see them alive and well. When he asked the team how they did it, the soldier showed the commander the map that had not only guided them back, but had also given them the hope to persevere.  Confused, the commander asked this soldier, “How on earth did you find your way using a map of the Pyrenees?”

This story teaches us many things; here are two:

  • Fear and anxiety can lead people to inaction, even to their own detriment (and the effect usually intensifies in groups)
  • Even with the wrong strategy or plan, the chances of success are higher than if there were no plan at all

The second point has many applications in the business world. The one I think of most, in terms of our manufacturing customers, is their shop floors. Often manufacturers, especially small and medium-sized ones, don't get a chance to go deep into process planning. Stations are haphazardly placed, too many or too few activities are scheduled at stations, new machinery is placed wherever it fits, and so on. All of this causes bottlenecks and slows getting things out the door. As we all know, time is money – especially in manufacturing, where every lost minute, hour, or day translates into lost revenue.

Tata Technologies has an amazing team of technical experts and works with many solution providers that can help manufacturers find their own map. One of the maturity benchmarks we offer is for the "Digital Factory"; contact us to schedule yours.


This post was originally written in January of 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week's blog post will address a common type of 3D printing known as Electron Beam Freeform Fabrication (EBF³).

What is Electron Beam Freeform Fabrication?

It is actually part of a broader category, commonly referred to as Filament Extrusion Techniques. Filament extrusion techniques all utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded as a 2D cross-section onto a platform. The platform is lowered and the process repeated until a part is completed. In most commercial machines, and in higher-end consumer-grade machines, the build area is typically kept at an elevated temperature to prevent part defects. The most common, and the first, technology of this type to be developed is Fused Deposition Modeling.

The Fused Deposition Modeling Technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd. in the late 1980s. The technology was then patented in 1989. The patent for FDM expired in the early 2000s. This helped to give rise to the Maker movement by allowing other companies to commercialize the technology.

Electron Beam Freeform Fabrication, or EBF³, is one of the newest forms of rapid prototyping. The technique uses a focused electron beam and a metal wire or filament. The wire is fed through the electron beam to create a molten pool of metal. The material solidifies instantly once the electron beam passes, and it is able to support itself, meaning support structures generally aren't required. The entire process must be executed under a high vacuum.

Pioneered by NASA Langley Research Center, this process is capable of producing incredibly accurate parts at full density (a result other additive manufacturing techniques have trouble achieving, or achieve only through secondary operations). It is also one of the only techniques that can be performed successfully in zero-gravity environments.

What Are the Advantages of this Process? […]

Are you faced with a complex data migration or translation? Do you have years of legacy data that needs to be migrated to a new system? Have you got old CAD data from an outdated system that is still being used?

If you have answered yes to any of these questions, you are facing the prospect of performing a migration or translation project. Here are 10 potential problems that you must look out for before starting:

  1.  Underestimation of effort – too many projects are underestimated, primarily because the use cases for the translation are thought to be simpler than they actually are. For example, only assemblies need translation – until someone remembers that drawings need to be included.
  2.  “Everything” syndrome – Looking at a project, most organizations default to attempting to translate or migrate everything. This is almost never necessary, as only a subset of the data is really relevant. Making this mistake can drive up both cost and complexity dramatically.
  3.  Duplicate data – of everything that needs to be moved, how much of it is duplicate data (or the same data in slightly different forms)? Experience shows that duplicate data percentages can be as high as 20 to 30%. Unfortunately, identifying these duplicates can be difficult, but there are techniques to overcome this problem.
  4.  Accuracy of CAD translation – When looking at 3D CAD translations, how accurate must the translated models be relative to the originals? Again, a blanket requirement of “identical” can drive up cost and complexity hugely. A lesser target (say ±2 mm) can improve success.
  5.  Data already exists in Target – Some level of informal manual migration may have already occurred. So, when a formal migration is performed, data “clashes” can occur and result in failures or troublesome duplicates.
  6.  Automatic is not always best – Developing an automated migration or translation tool can be costly if the requirements are numerous. Sometimes a manual approach is more cost-effective for smaller and simpler cases.
  7.  Data Enrichment – Because the source data was created in an older system, it may not have all the properties and data that the target system requires. In this case, these have to be added during the migration or translation process. Forgetting about this step will prevent users from accurately finding data later.
  8.  Loss of Data – For large data volumes, is it possible that some of the data is missed or deleted during the project? Very possible – preventing this requires exhaustive testing and planning.
  9.  Archive Solution – Once the translation or migration is complete, what happens to the original data? In some cases it is possible to delete it. However, in some environments (e.g. regulatory situations) this may not be allowed. In such a case, has an archive solution been put in place?
  10.  Security – Legacy data may be subject to security (ITAR, competitive data, etc.). Does the migration or translation process expose sensitive information to unauthorized users? Often a process will take the data out of its protected environment. This problem has to be considered and managed.
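The duplicate-data problem in item 3 is often tackled by content hashing. Below is a minimal sketch in Python; the `find_duplicates` helper and the assumption that legacy data sits in a plain directory tree are illustrative only, not a prescribed tool:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict:
    """Group files under `root` by the SHA-256 hash of their contents.

    Returns only the groups with more than one file, i.e. exact
    byte-for-byte duplicates. Near-duplicates (same data in slightly
    different forms) need fuzzier techniques than this sketch shows.
    """
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Running such a scan before the migration gives an early estimate of the duplicate percentage, which feeds directly back into the effort estimate from item 1.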

Ask these questions before translations and migrations begin!

This post was originally created in January 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Electron Beam Melting (EBM).

What is Electron Beam Melting?

It is actually part of a broader category, commonly referred to as a Granular Based Technique. All granular based additive manufacturing techniques start with a bed of powdered material. A laser beam or bonding agent joins the material in a cross section of the part. Then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross section. The process is then repeated until a complete part is produced. The first commercialized technique of this category is known as Selective Laser Sintering.

The Selective Laser Sintering Technique was developed in the mid-1980s by Dr. Carl Deckard and Dr. Joseph Beaman at the University of Texas at Austin, under DARPA sponsorship. Deckard and Beaman then established the DTM Corporation with the explicit purpose of manufacturing SLS machines; in 2001, DTM was purchased by its largest competitor, 3D Systems.

Electron Beam Melting is very similar to Selective Laser Melting, though there are a few distinct differences. EBM uses an electron beam to create a molten pool of material that forms each cross-section of the part. The material solidifies instantaneously once the electron beam passes. In addition, the technique must be performed in a vacuum. This is one of the few additive manufacturing techniques that can create full-density parts.

What Are the Advantages of this Process?

EBM is quick; it’s one of the fastest rapid prototyping techniques (though, relatively speaking, most techniques are fast). In addition, it can potentially be one of the most accurate rapid prototyping processes, the major limiting factor being the particle size of the powdered material.

As mentioned previously, this is one of the only additive manufacturing techniques that yields full-density parts; this means parts created with EBM will have similar properties to parts created using traditional manufacturing processes.

Another advantage of the material bed is the ability to stack multiple parts into the build envelope. This can greatly increase the throughput of an EBM machine.

What Are the Disadvantages of this Process? […]

This post was originally created in January 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Selective Laser Melting (SLM).

What is Selective Laser Melting?

It is actually part of a broader category, commonly referred to as a Granular Based Technique. All granular based additive manufacturing techniques start with a bed of a powdered material. A laser beam or bonding agent joins the material in a cross-section of the part. Then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross-section. The process is then repeated until a complete part is produced. The first commercialized technique of this category is known as Selective Laser Sintering.

The Selective Laser Sintering Technique was developed in the mid-1980s by Dr. Carl Deckard and Dr. Joseph Beaman at the University of Texas at Austin, under DARPA sponsorship. Deckard and Beaman then established the DTM Corporation with the explicit purpose of manufacturing SLS machines; in 2001, DTM was purchased by its largest competitor, 3D Systems.

SLM is a similar process to SLS, though there are some important differences. Instead of the substrate being sintered, it is melted to fuse layers together. This is typically done in a chamber filled with an inert gas (usually nitrogen or argon) with incredibly low levels of oxygen (below 500 parts per million), to prevent unwanted chemical reactions when the material changes its physical state. This technique yields higher-density parts than any sintering process.

What Are the Advantages of this Process?

SLM is quick; it is one of the fastest rapid prototyping techniques (though, relatively speaking, most techniques are fast). In addition, it can potentially be one of the most accurate rapid prototyping processes, the major limiting factor being the particle size of the powdered material.

As mentioned previously, this technique yields higher density parts than other additive manufacturing techniques, making for a much stronger part.

Another advantage of the material bed is the ability to stack multiple parts into the build envelope. This can greatly increase the throughput of an SLM machine.

What Are the Disadvantages of this Process? […]

Read Part 1 here.

So, what does a structured process to data migration and translation look like?

First a few definitions:

  • Source system – the origin of the data that needs to be translated or migrated. This could be a database or a directory structure.
  • Target system – the final destination for the data. On completion of the process, data in the target should be in the correct format.
  • Verify – Ensure that data placed in the target system is complete, accurate, and meets defined standards.
  • Staging area – an interim location where data is transformed, cleaned, or converted before being sent to the target.

The process consists of five steps as shown below:

[Figure: the five-step migration process]

The process can be described as follows:

  • Data to be migrated is identified in the source system. This is an important step and ensures that only relevant data is moved. Junk data is left behind.
  • The identified data is extracted from the source system and placed in the staging area.
  • The data is then transformed into a format ready for the target system. Such transformation could be a CAD to CAD translation, a metadata change, or a cleaning process. Transformation may also entail data enrichment – for example, appending additional properties to the objects so they can be better found in the target system.
  • Transformed data is then loaded into the target system. This can be done automatically via programs or manually, depending on the chosen method. Automated routines can fail; failures are flagged for analysis and action.
  • Once data is loaded, validation is carried out to ensure that the migrated data is correct in the target system and not corrupted in some fashion.
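The five steps can be sketched as a simple pipeline. The following Python outline is only an illustration under stated assumptions: the step functions (`identify`, `extract`, `transform`, `load`, `validate`) are hypothetical stand-ins supplied by the project team, in place of real extractor, translator, and loader utilities:

```python
def migrate(source_items, identify, extract, transform, load, validate):
    """Run the five-step migration over items from the source system.

    Per-record failures are collected for analysis and action rather
    than aborting the whole run, mirroring how failed automated
    routines are flagged in practice.
    """
    staging, migrated, failures = [], [], []

    # Steps 1-2: identify relevant data and extract it to the staging area
    for item in source_items:
        if identify(item):
            staging.append(extract(item))

    # Steps 3-5: transform, load into the target, then validate
    for record in staging:
        try:
            target_record = load(transform(record))
            if not validate(target_record):
                raise ValueError(f"validation failed for {record!r}")
            migrated.append(target_record)
        except Exception as exc:
            failures.append((record, exc))  # flagged for manual review

    return migrated, failures
```

The point of the sketch is the shape of the process, not the implementation: identification narrows the scope before extraction, and every loaded record passes through validation before it counts as migrated.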

The process as described above is shown at working level:

[Figure: working-level view of the migration process]

Shown in this diagram are two software tools – extractors and loaders. These are usually custom utilities that use APIs, or hooks into the source and target systems, to move the identified data. For example, an extractor tool may query a source PLM system for all released and frozen data that was released after a given date. Once the search is complete, the extractor downloads the identified data from the PLM system into the staging area.

In a similar manner, a loader will execute against a correct data set in the staging area and insert this into a target system, creating the required objects and adding the files.
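As an illustration of the extractor query described above, here is a small Python sketch. `SourceRecord` and `select_for_extraction` are hypothetical names, and a real extractor would call the source PLM system's API rather than filter an in-memory list:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class SourceRecord:
    item_id: str
    status: str                  # e.g. "Released", "Frozen", "In Work"
    released_on: Optional[date]  # None for data never released

def select_for_extraction(records: List[SourceRecord],
                          cutoff: date) -> List[SourceRecord]:
    """Mimic an extractor query: released or frozen data newer than `cutoff`."""
    return [
        r for r in records
        if r.status in ("Released", "Frozen")
        and r.released_on is not None
        and r.released_on > cutoff
    ]
```

Whatever the real query mechanism, the selection criteria belong in one place like this, so the same scope definition can be reviewed, tested, and re-run consistently across pilot and production migrations.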

It is highly recommended that pilot migrations be carried out on test data in developmental environments to verify the process. This testing will identify potential bugs and allow them to be fixed before actual data is touched.

Such a structured process will guarantee success!

My last post outlined the significance of Product Cost Management (PCM) for OEMs and Suppliers to drive profitability and continuous improvement throughout the entire supply chain.

Ideally, PCM should begin early in the product development cycle, as early as the conceptual phase, when design and supplier selection are still flexible. It is therefore important to enable cost engineering during the front end of product development and ensure profitability with control over costs for parts and tooling.

Not everyone can optimize cost early, nor in all situations; PCM processes and tools may also need to be applied in later stages of the product lifecycle. Even when cost models and fact-based consultation are applied early, they may need to be revisited several times over the lifecycle. PCM therefore needs to support the cost model across all corporate functions, from product development to sales, and establish a single consistent repository for estimating and communicating cost, with repeatable processes and historical information. As PCM spreads over the product lifecycle, it is important to take an enterprise-wide approach to costing. An ideal PCM system aligns with the product development process managed in a PLM system, so there is a lot of synergy between PLM and PCM.

The most commonly used tools for PCM – spreadsheets and custom programs that conduct simple rollups – are not suitable for enterprise-wide processes; these solutions do not provide the detail required to develop credible cost models. They also make it very difficult for designers to compare products, concepts, and scenarios. Spreadsheets fail due to quality problems and the inability to implement them effectively on an enterprise scale, resulting in different product lines, geographies, or lines of business taking different approaches. Non-enterprise approaches also make it difficult to reuse information or apply product changes, currency fluctuations, burden rate updates, or commodity cost changes.

By extending an enterprise-wide system like PLM with PCM functions, cost management is effectively communicated and captured, institutionalizing it for future product programs. This eliminates disconnected and inconsistent manual costing models and complex, difficult-to-maintain spreadsheets. It also supports easy, fast, and reliable impact analysis, incorporating product changes accurately into costs with visibility into all cost factors, and makes these processes repeatable. The PCM process can also leverage the existing 3D parametric model data managed in PLM systems to extract relevant parameters, such as thickness, surface, and volume, for feature-based calculations. Other PLM data that can be reused for PCM includes labor rates from engineering project management, material costs from material management modules, and bills of materials/process and tooling from engineering and manufacturing data management. An integrated PLM and PCM solution is also important for efficiency, allowing companies to reuse both product data and cost models to facilitate continuous improvement over time.

In the next post of this series, I explain how the Siemens PLM Teamcenter suite supports PCM.

This post was originally created in January 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Powdered Bed & Inkjet 3D Printing (3DP).

What is Powdered Bed & Inkjet 3D Printing?

It is actually part of a broader category, commonly referred to as a Granular Based Technique. All granular based additive manufacturing techniques start with a bed of a powdered material. A laser beam or bonding agent joins the material in a cross section of the part. Then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross section. The process is then repeated until a complete part is produced. The first commercialized technique of this category is known as Selective Laser Sintering, though the main point of discussion here is Powdered Bed & Inkjet 3D Printing.

Invented in 1993 at the Massachusetts Institute of Technology, it was commercialized by Z Corporation in 1995. The technology uses a powdered material, traditionally a plaster or starch, held together with a binder. More materials are available now, such as calcium carbonate and powdered acrylic.

Though 3DP is a granular (or powder) based technique, it does not use a laser to create a part. Instead, a glue or binder joins the part. It is also worth mentioning that this technique is where the term "3D Printing" originated, as it uses an inkjet-style printing head.

What Are the Advantages of this Process?

This process is one of the few Rapid Prototyping Techniques that can produce fully colored parts, through the integration of inks in the binders.

In addition, the material costs for this particular technique are relatively low, due to their wide commercial availability.

Because parts are created in a bed of material, there is no need for support structures, unlike in other forms of rapid prototyping. This helps avoid secondary operations and machining.

Another advantage of the material bed is the ability to stack multiple parts into the build envelope. This can greatly increase the throughput of a 3DP machine.

What Are the Disadvantages of this Process? […]

Standing on the beach, overlooking the bountiful, yet imperfect, harvest, he pondered the situation in front of him. “Why are all of my troop mates eating these sand-covered sweet potatoes? In the beginning, they were delicious…and without the sand. Now? These wonderful treats are all but inedible. What if I…

This is the beginning of a tale based on a scientific research project, though it may have evolved into something of an urban legend. The idea is that scientists in Japan, circa 1952, were studying the behaviors of an island full of macaque monkeys. At first, the scientists gave the monkeys sweet potatoes. After a period of time, the scientists started covering the sweet potatoes in sand to observe how the monkeys would react. Not surprisingly, the monkeys still ate the treats, however begrudgingly. Then, the story goes, a young monkey took the vegetable to the water and washed it off. He discovered that it tasted as good as it had before the sand. Excitedly, the young monkey showed this discovery to his mother. Approvingly, his mother began washing hers in the water as well.

Still, the vast majority went on, crunching away on their gritty meals. Over time, a few more monkeys caught on. It wasn't until a magic number of monkeys were doing this – we'll say the 100th – that seemingly the entire troop began rinsing their sweet potatoes off in the water.

Call it what you will – social validation, the tipping point, the 100th monkey effect, etc. It all comes down to the idea that we may not try something new, however potentially beneficial, until it's "OK" to do so. Cloud solutions for PLM could be coming to that point. These products have been in the market for a few years now, and they mature with every update (and with no upgrade headaches, either).

As reported by IDG Enterprise in its "2016 IDG Enterprise Cloud Computing Survey," "Within the next three years, organizations have the largest plans to move data storage/data management (43%) and business/data analytics (43%) to the cloud." Another survey, RightScale's "2017 State of the Cloud Survey," finds that overall challenges to adopting cloud services have declined. One of the most important matters, security, has fallen from 29% of respondents reporting it as a concern to 25%. Security is still a valid concern, though I think the market is starting to trust the cloud more and more.

With our experience and expertise with PLM solutions in the cloud, Tata Technologies can help you choose if, when, and how a cloud solution could be right for your company. Let us know how we can help.

© Tata Technologies 2009-2015. All rights reserved.