
Any complete FEA solution has at least three mandatory components: a pre-processor, a solver, and a post-processor. If you compare it with an automobile, the solver is the engine: it contains all the steps and solution sequences needed to solve the discretized model and can be regarded as the main power source of a CAE system. The pre-processor is a graphical user interface that allows the user to define all the inputs to the model, such as geometry, materials, loads, and boundary conditions. In our automobile analogy, the pre-processor is the ignition key, without which it is not possible to utilize the engine (the solver) efficiently. The post-processor is a visualization tool for drawing conclusions from the requested output, whether text or binary. A good CAE workflow is regarded as one that offers closed-loop data transfer between CAD and CAE.

The above workflow is not closed, so there is no scope for model updates. Any design change requires all of the rework to be repeated. This has been the traditional workflow in organizations with completely disconnected design and analysis departments: designers send CAD data to analysts, who perform FEA in specialized tools and send a virtual performance report for the product back to the designers. If a change is required, the FEA is performed manually all over again. Let’s look at a better workflow.

In this workflow, if the initial design does not meet the design requirements, it is updated and sent to the solver, not to the pre-processor. That means all the pre-processing steps are mapped from the old design to the new design without any manual intervention. This is an effort to bridge the gap between design and analysis departments, and it has been embraced by the industry. The extent to which the gap can be bridged depends on the chosen workflow, but almost every CAE company has taken some initiative to introduce products that bridge it. Let’s discuss this in the context of Dassault Systemes and Siemens.

Dassault Systemes: After acquiring Abaqus Inc in 2005, Dassault Systemes rebranded it as SIMULIA with the objective of giving users access to simulation capabilities without requiring the steep learning curve of disparate, traditional simulation tools. They have been introducing new tools to meet this objective.

  • The first in the series was the Associative Interfaces for CATIA, Pro-E, and Solidworks, which are plug-ins to Abaqus CAE. With these plug-ins it is possible to transfer updated data from the above-mentioned CAD platforms to Abaqus CAE automatically, with a single click. All the CAE parameters in Abaqus CAE are mapped from the old design to the updated design. It’s a nice way to reduce rework, but design and simulation teams are still separate in this workflow.
  • The next initiative was SIMULIA V5, in which Abaqus was introduced inside CATIA V5 as a separate workbench. This workbench adds toolbars to define an Abaqus model and generate an Abaqus input file from within CATIA. Introduce Knowledgeware, and the user has all the features needed to perform DOEs and parametric studies. This approach brings designers and analysts with CATIA experience under one roof.
  • Next, Dassault Systemes introduced SIMULIA on the 3DEXPERIENCE platform, allowing analysts to use data management, process management, and collaboration tools with Abaqus in the form of simulation apps and roles. The solution is now at a mature stage, incorporating process optimization, lightweighting, durability, and advanced CFD tools. By merging SIMULIA with BIOVIA, we are also talking about multiscale simulation from the system level down to the molecular level. It is further possible to run the simulations and store the data on a public or private cloud.

Siemens PLM solutions: Siemens’ traditional CAE tools include the FEMAP user interface and the NX Nastran solver. Both have been specialized tools primarily meant for analysts, with little or no connectivity to CAD. More specialized, domain-specific tools were added with the acquisitions of LMS and Mentor Graphics.

  • In 2016, Siemens introduced its new simulation portfolio, Simcenter, which gathers all of Siemens’ simulation capabilities and can be integrated with the NX environment. The popular pre-processor in the Simcenter series is NX CAE, which has bi-directional associativity with NX CAD. Though meant for specialists, NX CAE offers a closed-loop workflow between NX CAD and NX Nastran, making it easier to evaluate redesigns and perform DOEs.
  • Siemens also offers NX CAE add-on environments for Abaqus and Ansys, allowing analysts to efficiently incorporate these solvers into their NX design environment.
  • It is further possible to use Simcenter with Siemens’ well-known PLM solution, Teamcenter, for enterprise-wide deployment of Siemens simulation tools.

This shift in approach is not limited to Dassault Systemes and Siemens. Every organization in this space, be it Ansys, Autodesk, or Altair, is introducing such closed-loop solutions. One reason may be the recent acquisition of many CAE companies by bigger organizations such as Dassault, Siemens, and Autodesk. Nevertheless, the change has been triggered and it will continue.


According to a PLM Foresight Opinion Poll conducted by CIMData, 70% of Senior Executives in manufacturing companies saw little or no value in PLM. This is a troubling statistic as it shows that PLM is not as widely adopted and embraced as it should be. PLM can bring huge efficiency gains to an organization and prevent a lot of errors.

How can you get efficiency from PLM? One approach is to use a maturity assessment. Maturity models investigate issues related to best practices of the system being evaluated, using defined “pillars” of major functionality. When the maturity of a given “pillar” is evaluated and measured, it provides current and future capability levels and can be used to identify potential improvements and goals for the system being evaluated. When a maturity model is applied to and shared by a number of organizations in a particular industry, it can provide an industry-specific benchmark against which to evaluate an organization’s maturity with respect to others in the same industry.

So why should a company assess the maturity of their PLM? This assessment can guide companies to a PLM roadmap that will enable them to improve the state of their current system. The roadmap will allow them to deploy appropriate technologies, processes, and organizational changes that enhance the overall product development process from early concept through product end of life. This in turn leads to improved bottom-line and top-line benefits.

Tata Technologies has developed PLM Analytics as a framework to provide customers with information about the state of PLM within their enterprise and the maturity (ability to adopt and deploy) of their PLM, and ultimately to create a detailed PLM roadmap that will support their business strategy and objectives. Each component builds on and complements the others, but can be conducted independently.

What is PLM Analytics? A high-level diagram is shown below:

PLM Benchmark: Helps triage your PLM needs and find out how you stack up against the competition. The resulting report benchmarks performance against 17 industry-standard “pillars” and evaluates the current state and desired future state, with an indication of implementation priority. The Benchmark is a consultant-led personal interview with one or more key business leaders. It is toolset-agnostic.

PLM Impact: Builds a business case and demonstrates how PLM can save you money. Once a PLM Benchmark has been completed, a consultant-led series of interviews with multiple key business leaders can identify savings opportunities. These opportunities are summarized as financial metrics, such as payback and ROI, which can be used to validate proposed PLM initiatives and provide decision makers with key financial data.

PLM Healthcheck: Understand how your current PLM works from the people who use it. The healthcheck surveys a cross-section of your product design team via online assessments to establish PLM health related to organization, processes, and technology. The results identify gaps against best practices, measure the consistency of organizational performance, and prioritize areas of improvement. The healthcheck can be used on a global basis.

PLM Roadmap: A 360° view of your PLM plus a detailed roadmap for pragmatic implementation. The roadmap is constructed from onsite interviews with senior leaders, middle management, and end users across product development. These sessions focus on the specific business processes and technologies to be improved and result in a PLM Roadmap, an actionable improvement plan with defined activities and internal owners to ensure successful implementation.

By using this suite of tools, Tata Technologies can put you on the road to PLM success!

You can’t get away from it: IoT, the Internet of Things. Connected devices everywhere. It is estimated that there will be over 50 billion connected devices by 2020. At this rate, our toasters will be connected to the internet! From your smartphone, you will be able to control the heat level, monitor the temperature and look for cool spots, and watch the carbohydrates carbonize in real time!

While we may not actually see the connected toaster (at least I hope not), many companies are looking into their own IoT strategies.

Manufacturers are no different.  Industrial IoT (IIoT) is helping manufacturers learn more about their own products.  All this information is helping to create brand new business models, selling outcomes instead of products.  GE is probably the most notable example of this, where they sell flight hours instead of jet engines.  With an immense amount of data generated by sensors on these engines, GE is able to predict failures and analyze performance to fine tune operation.

The question now is not how a company can utilize this type of operational feedback, but when. Manufacturers will have to understand how that data fits into their systems engineering. As of now, this would require a company to apply the digital thread and digital twin philosophy. In short, the digital twin first accounts for the PLM’ish information leading up to delivery (requirements, simulations, designs, manufacturing processes, etc.). Then, each physical product gets an instance of the digital twin to record what happens after delivery. This includes as-built conditions, service modifications, and, here is the IoT connection, operational parameters.
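To make that idea concrete, here is a minimal sketch (in Python) of what a per-unit digital twin record might hold. It is purely illustrative: the class and field names are hypothetical and are not taken from any particular PLM or IoT platform.

```python
# Illustrative only: a deliberately simplified record for one delivered unit.
# The as-designed definition lives in PLM; each physical unit then accumulates
# as-built, service, and operational (IIoT) data over its life.
from dataclasses import dataclass, field

@dataclass
class DigitalTwinInstance:
    serial_number: str
    as_designed_revision: str                           # link back to the PLM definition
    as_built: dict = field(default_factory=dict)        # e.g. installed options, lot codes
    service_history: list = field(default_factory=list)
    telemetry: list = field(default_factory=list)       # operational parameters from sensors

    def record_reading(self, sensor: str, value: float, timestamp: str) -> None:
        self.telemetry.append({"sensor": sensor, "value": value, "t": timestamp})

engine_001 = DigitalTwinInstance("SN-001", as_designed_revision="C")
engine_001.record_reading("exhaust_gas_temp_C", 612.4, "2017-05-01T10:15:00Z")
```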

The market may not yet be ready for this level of digitalization (a term I first heard here, at least in this context). So many companies are still using shared drives and spreadsheets as their PLM tools of choice. What we discovered is that if a company wants an IoT strategy, it has to understand what that entails. Will the company be ready to change its culture fast enough? IIoT is a great feedback loop; however, there has to be something to feed all that data back into. The promise is there; I think people understand the benefits.

To answer the original question: I think PLM may be ready for it, as it matures and incorporates systems engineering, simulation-in-the-loop, and other enabling technologies. However, I don’t think the market is ready for the marriage of IIoT and PLM.

In a world of controlled, strict, inflexible systems, people start to get creative. For some, it’s the crushing weight of a massively customized ERP system that has somehow spread to every part of the organization; for others, it’s circumventing “inconvenient” safety devices. When things need to get done, sometimes we take matters into our own hands. In the IT world, this is called “Shadow IT,” which is basically any app, software, or program that isn’t under the control of (or even known to) the IT department: users downloading freeware, buying their own software, using cloud services, and so on. Even NASA has difficulty reining in its employees when it comes to using non-sanctioned software.

This behavior extends into the design and engineering office as well, perhaps more so than in other parts of an average organization. It’s in engineers’ nature to solve problems; it’s practically the key attribute of an engineering job description! I can understand why: engineers live and breathe efficiency, and being over-encumbered by poorly designed systems is not efficient at all.

Case in point: the Bill of Materials (BOM). How many systems are controlling it? ERP? The design tool? PDM?  Some home-grown custom system?  By and large…no.  Most work-in-process BOMs are done in spreadsheets.  And why not?  Spreadsheet applications are easy to use and share, don’t require much training, and don’t require six levels of approval to implement. Spreadsheets can even be automated with macros, making BOMs configurable and intelligent. Eventually all items do end up in the ERP, though typically not until late in a product’s/project’s/program’s lifecycle.

So, why not stay with the spreadsheets? What if someone edits those macros and an unknowing user doesn’t verify the end result? What if the file gets corrupted? How does the rest of the organization gain visibility into the latest changes? What ensures that the BOM is up to date with the design? Ultimately, the BOM should be controlled in a PLM system, much to the chagrin of clever engineers everywhere. Here’s why.

Just as when the market moved from pencil drawings, French curves, vellum, and other manual drafting techniques to 2D CAD tools, and then from 2D design to 3D modeling, the answer comes down to one thing: change. Yes, sketching a part is faster than creating the lines and circles, but CAD technology handles the updates caused by change much faster than a manual process. The gains of going from 2D designs to 3D models are even more staggering. “You mean one edit and it updates everywhere? Even the drawing?” Anecdotally, I had a customer say, “With 2D, whenever the shop called, I was worried about what we messed up. Now, with 3D models, when the shop calls, I worry about what they messed up.”

Again, it’s about the rate of change, and the propagation and validation of that change. Spreadsheets cannot do that (unless you have some really wicked cool macros).
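To show what “propagation of the change” looks like when the BOM lives in a structured system rather than a spreadsheet, here is a minimal sketch. It is not the data model of any real PLM product; the class and field names are hypothetical, and a production system would also carry effectivity, change orders, and approvals.

```python
# Minimal sketch: a where-used index lets the system flag every parent assembly
# the moment a part is revised. Illustrative only; names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Part:
    number: str
    revision: str
    children: list = field(default_factory=list)   # child part numbers

class Bom:
    def __init__(self, parts):
        self.parts = {p.number: p for p in parts}
        # Build the where-used index: child part number -> parent part numbers
        self.where_used = {}
        for p in parts:
            for child in p.children:
                self.where_used.setdefault(child, set()).add(p.number)

    def revise(self, number, new_revision):
        """Bump a part's revision and return every assembly that needs re-validation."""
        self.parts[number].revision = new_revision
        impacted, queue = set(), [number]
        while queue:                                  # walk up the where-used chain
            for parent in self.where_used.get(queue.pop(), ()):
                if parent not in impacted:
                    impacted.add(parent)
                    queue.append(parent)
        return impacted

bom = Bom([
    Part("ASSY-100", "A", ["SUB-200"]),
    Part("SUB-200", "A", ["PRT-300"]),
    Part("PRT-300", "A"),
])
print(bom.revise("PRT-300", "B"))   # {'SUB-200', 'ASSY-100'} flagged for review
```

The point is not the code; it is that the where-used relationship is data the system owns, so the impact of a change is known immediately instead of living in someone’s head (or a macro).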

With our PLM Analytics Benchmark, we can help your company to assess the nature of your BOM needs, as well as the 16 pillars of PLM. Let us know if we can be of service!

Are you faced with a complex data migration or translation? Do you have years of legacy data that needs to be migrated to a new system? Do you have old CAD data from an outdated system that is still being used?

If you have answered yes to any of these questions, you are facing the prospect of performing a migration or translation project. Here are 10 potential problems that you must look out for before starting:

  1.  Underestimation of effort – too many projects are underestimated, primarily because the use cases for the translation are thought to be simpler than they actually are. For example, assemblies only need translation until someone remembers that drawings need to be included.
  2.  “Everything” syndrome – Looking at a project, most organizations default to attempting to translate or migrate everything. This is rarely necessary, as only a subset of the data is really relevant. Making this mistake can drive up both cost and complexity dramatically.
  3.  Duplicate data – of everything that needs to be moved, how much of it is duplicate data (or the same data in slightly different forms)? Experience shows that duplicate data percentages can be as high as 20 to 30%. Unfortunately, identifying these duplicates can be difficult, but there are techniques to overcome the problem (see the sketch after this list).
  4.  Accuracy of CAD translation – When looking at 3D CAD translations, how accurate do the translated models need to be relative to the originals? Again, a blanket requirement of “identical” can drive up cost and complexity hugely. Some lesser target (say ±2 mm) can improve success.
  5.  Data already exists in Target – Some level of informal manual migration may have already occurred. So, when a formal migration is performed, data “clashes” can occur and result in failures or troublesome duplicates.
  6.  Automatic is not always best – Developing an automated migration or translation tool can be costly if the requirements are numerous. Sometimes a manual approach is more cost-effective for smaller and simpler cases.
  7.  Data Enrichment – Because the source data was created in an older system, it may not have all the properties and data that the target system requires. In this case, these have to be added during the migration or translation process. Forgetting about this step will prevent users from accurately finding data later.
  8.  Loss of Data – For large data volumes, is it possible that some of the data is missed and deleted during the project? Very possible – to prevent this requires exhaustive testing and planning.
  9.  Archive Solution – Once the translation or migration is complete, what happens to the original data? In some cases it is possible to delete it. However, in some environments (e.g. regulatory situations) this may not be allowed. In such a case, has an archive solution been put in place?
  10.  Security – Legacy data may be subject to security (ITAR, competitive data, etc.). Does the migration or translation process expose sensitive information to unauthorized users? Often a process will take the data out of its protected environment. This problem has to be considered and managed.
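As an example of one technique for the duplicate-data problem in item 3, here is a minimal sketch that groups candidate duplicates by file content hash. It is illustrative only: real projects usually combine hashing with metadata and geometry comparison, and the directory path and file pattern below are hypothetical.

```python
# Group candidate duplicate CAD files by SHA-256 content hash.
# Illustrative sketch; the path and the "*.prt" pattern are hypothetical.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str, pattern: str = "*.prt") -> dict:
    groups = defaultdict(list)
    for path in Path(root).rglob(pattern):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        groups[digest].append(path)
    # Only hashes seen more than once are duplicate candidates
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

for digest, paths in find_duplicates("/legacy/cad_vault").items():
    print(f"{len(paths)} identical files: {[str(p) for p in paths]}")
```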

Ask these questions before translations and migrations begin!

Read Part 1 here.

So, what does a structured process to data migration and translation look like?

First a few definitions:

  • Source system – the origin of the data that needs to be translated or migrated. This could be a database or a directory structure.
  • Target system – the final destination for the data. On completion of the process, data in the target should be in the correct format.
  • Verify – Ensure that data placed in the target system is complete, accurate, and meets defined standards.
  • Staging area – an interim location where data is transformed, cleaned, or converted before being sent to the target.

The process consists of five steps as shown below:

[Diagram: the five-step migration process]

The process can be described as follows:

  • Data to be migrated is identified in the source system. This is an important step and ensures that only relevant data is moved. Junk data is left behind.
  • The identified data is extracted from the source system and placed in the staging area.
  • The data is then transformed into a format ready for the target system. Such a transformation could be a CAD-to-CAD translation, a metadata change, or a cleaning process. Transformation may also entail data enrichment – for example, appending additional properties to the objects so they can be found more easily in the target system (a minimal sketch follows this list).
  • Transformed data is then loaded into the target system. This can be done automatically via programs or manually, dependent on the chosen method. Automatic routines can fail and these are flagged for analysis and action.
  • Once data is loaded, validation is carried out to ensure that the migrated data is correct in the target system and not corrupted in some fashion.
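Here is a minimal sketch of the transformation and enrichment step referred to above: source metadata is mapped to the target’s schema and enriched with properties the target needs for findability. The field names, type mapping, and enrichment rules are hypothetical; a real project would derive them from the target system’s data model.

```python
# Illustrative transform/enrich step for one source record. All field names
# and mapping rules are hypothetical.
LEGACY_TO_TARGET_TYPE = {"DRW": "Drawing", "MOD": "CADModel"}

def transform(record: dict) -> dict:
    item = {
        "item_id": record["legacy_number"].strip().upper(),
        "type": LEGACY_TO_TARGET_TYPE.get(record["doc_type"], "Document"),
        "revision": record.get("rev", "A"),
    }
    # Enrichment: add properties the target system requires so users can find data later
    item["project_code"] = record.get("project", "UNKNOWN")
    item["migrated_from"] = "legacy_pdm"
    return item

print(transform({"legacy_number": " abc-1001 ", "doc_type": "DRW", "rev": "B"}))
```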

The process described above is shown at the working level:

[Diagram: working-level view of the migration process, showing the extractor and loader tools and the staging area]

Shown in this diagram are two software tools – extractors and loaders. These are usually custom utilities that use APIs, or hooks into the source and target systems, to move the identified data. For example, an extractor tool may query a source PLM system for all released and frozen data that was released after a given date. Once this search is complete, the data it identifies is downloaded by the extractor from the PLM system into the staging area.

In a similar manner, a loader executes against a verified data set in the staging area and inserts it into the target system, creating the required objects and adding the files.
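The skeleton below shows how such an extractor and loader might be structured around the staging area. It is only a sketch: the source_api and target_api objects and their methods are hypothetical stand-ins for whatever query and import interfaces the real source and target systems expose, and failed loads are simply collected for analysis.

```python
# Illustrative extractor/loader skeleton; the APIs shown are hypothetical.
from datetime import date

STAGING = "/staging/migration_batch_01"

def extract(source_api, released_after: date) -> list:
    """Query the source system for released data and download it to staging."""
    items = source_api.query(status="Released", released_after=released_after)
    for item in items:
        source_api.download(item, STAGING)
    return items

def load(target_api, items: list) -> list:
    """Create the corresponding objects in the target and attach the files."""
    failures = []
    for item in items:
        try:
            obj = target_api.create_item(number=item.number, revision=item.revision)
            target_api.attach_files(obj, STAGING, item.files)
        except Exception as exc:           # failed loads are flagged for analysis and action
            failures.append((item, exc))
    return failures
```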

It is highly recommended that pilot migrations be carried out on test data in development environments to verify the process. This testing will identify potential bugs and allow them to be fixed before real data is touched.

Such a structured process will guarantee success!

My last post outlined the significance of Product Cost Management (PCM) for OEMs and Suppliers to drive profitability and continuous improvement throughout the entire supply chain.

Ideally, PCM needs to be done early in the product development cycle, as early as the conceptual phase, because design and supplier selection are much more flexible early in the process. It is therefore important to enable cost engineering during the front end of product development and ensure profitability with control over costs for parts and tooling.

Not everyone can optimize cost early, and not in all situations; PCM processes and tools may also need to be applied in later stages of the product lifecycle. Even when cost models and fact-based consultation are applied early in the lifecycle, they may need to be revisited several times over the lifecycle. PCM therefore needs to support the cost model across all corporate functions, from product development to sales, and establish a single, consistent repository for estimating and communicating cost with repeatable processes and historical information. Because PCM is spread over the product lifecycle, it’s important to take an enterprise-wide approach to costing. An ideal PCM system needs to align with the product development process managed in a PLM system, so there is a lot of synergy between PLM and PCM.

The most commonly used tools for PCM – spreadsheets and custom programs that perform simple rollups – are not suitable for enterprise-wide processes; these solutions do not provide the details required to develop credible cost models. They also make it very difficult for designers to compare products, concepts, and scenarios. Spreadsheets fail due to quality problems and the inability to implement them effectively on an enterprise scale, resulting in different product lines, geographies, or lines of business having different approaches. Non-enterprise approaches also make it difficult to reuse information or to apply product changes, currency fluctuations, burden rate updates, or commodity cost changes.

By extending an enterprise-wide system like PLM to cover PCM functions, cost management is effectively captured and communicated, and it becomes institutionalized for future product programs. This eliminates disconnected and inconsistent manual costing models and complex, difficult-to-maintain spreadsheets. It also supports easy, fast, and reliable impact analysis, so product changes can be incorporated accurately into costs with visibility of all cost factors, and it makes these processes repeatable. The PCM process can also leverage the existing 3D model parametric data managed in PLM systems, extracting relevant parameters such as thickness, surface area, and volume for feature-based calculations. Other PLM data that can be reused for PCM includes labor rates from engineering project management, material costs from material management modules, and bills of materials/process and tooling from engineering and manufacturing data management. An integrated PLM and PCM solution is also important for efficiency, allowing companies to reuse both product data and cost models to facilitate continuous improvement over time.
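As a rough illustration of a feature-based calculation using the kind of parametric data (volume, surface area) that can be extracted from CAD models managed in PLM, here is a deliberately simplified sketch. All rates, field names, and the cost formula are hypothetical; real PCM tools use far richer cost models and pull their rates from the modules mentioned above.

```python
# Simplified feature-based cost roll-up. Illustrative only; every number and
# name here is hypothetical.
from dataclasses import dataclass

@dataclass
class PartParameters:
    volume_cm3: float        # extracted from the 3D model
    surface_cm2: float
    material: str

MATERIAL_COST_PER_CM3 = {"aluminium": 0.012, "steel": 0.006}   # currency units per cm^3
FINISHING_COST_PER_CM2 = 0.002
LABOR_RATE_PER_HOUR = 45.0                                     # e.g. from project management

def estimate_part_cost(p: PartParameters, machining_hours: float) -> float:
    material = p.volume_cm3 * MATERIAL_COST_PER_CM3[p.material]
    finishing = p.surface_cm2 * FINISHING_COST_PER_CM2
    labor = machining_hours * LABOR_RATE_PER_HOUR
    return round(material + finishing + labor, 2)

bracket = PartParameters(volume_cm3=120.0, surface_cm2=340.0, material="aluminium")
print(estimate_part_cost(bracket, machining_hours=0.25))   # 13.37
```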

In the next post of this series, I explain how the Siemens PLM Teamcenter suite supports PCM.

Standing on the beach, overlooking the bountiful yet imperfect harvest, he pondered the situation in front of him. “Why are all of my troop mates eating these sand-covered sweet potatoes? In the beginning, they were delicious…and without the sand. Now? These wonderful treats are all but inedible. What if I…”

This is the beginning of a tale based on a scientific research project, though it may have evolved into something of an urban legend. The idea is that scientists in Japan, circa 1952, were studying the behavior of an island full of macaque monkeys. At first, the scientists gave the monkeys sweet potatoes. After a period of time, the scientists started covering the sweet potatoes in sand to observe how the monkeys would react. Not surprisingly, the monkeys still ate the treats, however begrudgingly. Then, the story goes, a young monkey took the vegetable to the water and washed it off. He discovered that it tasted as good as it did before the sand. Excitedly, the young monkey showed this discovery to his mother. Approvingly, his mother began washing hers in the water as well.

Still, the vast majority went on crunching away on their gritty meals. Over time, a few more monkeys caught on. It wasn’t until a magic number of monkeys were doing this – we’ll say the 100th – that seemingly the entire troop began rinsing their sweet potatoes off in the water.

Call it what you will – social validation, the tipping point, the 100th monkey effect. It all comes down to the idea that we may not try something new, however potentially beneficial, until it’s “OK” to do so. Cloud solutions for PLM could be coming to that point. These products have been in the market for a few years now, and they mature with every update (with no upgrade headaches, either).

IDG Enterprise reports in its “2016 IDG Enterprise Cloud Computing Survey” that “Within the next three years, organizations have the largest plans to move data storage/data management (43%) and business/data analytics (43%) to the cloud.” Another survey, the “2017 State of the Cloud Survey” by Rightscale, finds that overall challenges to adopting cloud services have declined. One of the most important concerns, security, has fallen from 29% of respondents reporting it as a challenge to 25%. Security is still a valid concern, though I think the market is starting to trust the cloud more and more.

With our experience and expertise with PLM solutions in the cloud, Tata Technologies can help you choose if, when, and how a cloud solution could be right for your company. Let us know how we can help.

What is data migration and translation? Here is a definition that will help:

  • Information exists in different formats and representations. For example, Egyptian hieroglyphics are a pictorial language (representation) inscribed on stone (format).
  • However, information is only useful to a consumer in a specific format and representation. Roman letters printed on paper may mean the same as an equivalent hieroglyphic text, but the latter could not be understood by an English reader.
  • Migration moves data between formats – such as stone to paper.
  • Translation moves data between representations – hieroglyphics to Roman letters.

What must a migration and translation achieve?

  • The process preserves the accuracy of the information
  • The process is consistent

In the PLM world, the requirement for data translation and migration arises as the result of multiple conditions. Examples include changes in technology (one CAD system to another), software upgrades (from one version of a PLM system to a later one), combining data from two different sources (CAD files on a directory system with files in a PDM), acquisitions and mergers between companies (combining product data), and integration between systems (connecting PLM to ERP).

However, migrations and translations can be fraught with problems and require considerable effort. Here are some reasons: […]

Everyone knows that a PLM journey can be a long and expensive path, with frustrations at every turn. The question an organization often asks is: is it worth trying to walk that path?

Effective and correctly implemented PLM can significantly impact several business costs, resulting in large organizational savings. Take a look at the list below and consider how your costs look right now – you may be able to answer your own question.

10 Business Costs Directly Impacted by PLM

  1. Factory Rework and Scrap. These costs can be substantial in a manufacturing organization. Not all rework and scrap is caused by insufficient or miscommunicated engineering and design, but a sizeable percentage is traceable back to this root cause. An effective PLM setup will reduce engineering-originated errors by providing timely and accurate information to the factory floor.
  2. Supplier Quality. Getting timely and accurate information to your suppliers can ensure that they deliver quality parts to your production line. PLM correctly configured can make this happen.
  3. Expedited freight costs. How many times does a product get out of your factories late? In order not to incur penalties, the shipping is expedited at a huge premium. Can any of these incidents be traced back to delayed engineering data? Then a PLM system can help.
  4. Effort to process bids. To win business, you need to respond to RFQs by preparing bids. This effort does not directly generate revenue, and so the preparation process must be as streamlined as possible. Are your key people distracted by bids? Automating the process with a PLM system will reduce the effort required.
  5. Time to create reports. Management requires reports that need to be reviewed. Are these created manually from disparate sources? Why not use a PLM system to generate these reports automatically on demand? There are huge time savings to be had from this enhancement.
  6. Time preparing data for downstream users. How much time does your valuable engineering resource spend extracting, converting, and transmitting engineering data to downstream users? Hours per week? This cost can be avoided completely by setting up a PLM system to deliver this data with no effort from the engineers.
  7. Effort to process engineering change. Your company struggles to process engineering change requests and notices. Many are late and require multiple rework cycles. A PLM can fix that by automating the process and ensuring accurate information.
  8. Cost of physical prototypes. Do you spend a lot of money on building and testing physical prototypes as part of your design process? Do you have to build them all or could some be eliminated by better engineering tools and virtual simulation? A leading-edge PLM system can reduce this dramatically.
  9. Your suppliers deliver parts that require rework. You are constantly getting incorrect parts from your suppliers. But do your suppliers have the right information to begin with? PLM technology can bridge this gap.
  10. Wasted development effort. Do you spend funds developing products that go nowhere? This problem can be addressed by a PLM system that manages your development portfolio more accurately.

Do you have more than three of these costs that concern you or that are out of control? Then you definitely need to take a serious look at implementing or reworking your PLM system. We can help – just let us know.
