Category "Product Lifecycle Management"

Are you faced with a complex data migration or translation? Do you have years of legacy data that needs to be migrated to a new system? Have you got old CAD data from an outdated system that is still being used?

If you have answered yes to any of these questions, you are facing the prospect of performing a migration or translation project. Here are 10 potential problems that you must look out for before starting:

  1.  Underestimation of effort – too many projects are underestimated, primarily because the use cases for the translation are thought to be simpler than they actually are. For example, assemblies only need translation until someone remembers that drawings need to be included.
  2.  “Everything” syndrome – Looking at a project, most organizations default to attempting to translate or migrate everything. This is almost never necessary, as only a subset of the data is really relevant. Making this mistake can drive up both cost and complexity dramatically.
  3.  Duplicate data – of everything that needs to be moved, how much of it is duplicate data (or the same data in slightly different forms)? Experience shows that duplicate data percentages can be as high as 20 to 30%. Unfortunately, identifying these duplicates can be difficult, but there are techniques to overcome this problem.
  4.  Accuracy of CAD translation – When looking at 3D CAD translations, how accurate a copy do the translated models need to be relative to the originals? Again, a blanket requirement of “identical” can drive up cost and complexity hugely. Some lesser target (say ±2 mm) can improve success.
  5.  Data already exists in Target – Some level of informal manual migration may have already occurred. So, when a formal migration is performed, data “clashes” can occur and result in failures or troublesome duplicates.
  6.  Automatic is not always best – Developing an automated migration or translation tool can be costly if the requirements are numerous. Sometimes, a manual approach is more cost-effective for smaller and simpler cases.
  7.  Data Enrichment – Because the source data was created in an older system, it may not have all the properties and data that the target system requires. In this case, these have to be added during the migration or translation process. Forgetting about this step will prevent users from accurately finding data later.
  8.  Loss of Data – For large data volumes, is it possible that some of the data is missed and deleted during the project? Very possible – to prevent this requires exhaustive testing and planning.
  9.  Archive Solution – Once the translation or migration is complete, what happens to the original data? In some cases it is possible to delete it. However, in some environments (e.g. regulatory situations) this may not be allowed. In such a case, has an archive solution been put in place?
  10.  Security – Legacy data may be subject to security restrictions (ITAR, competitive data, etc.). Does the migration or translation process expose sensitive information to unauthorized users? Often a process will take the data out of its protected environment. This risk has to be considered and managed.
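The duplicate-data problem in point 3 can often be attacked with content hashing: files whose bytes hash to the same digest are exact duplicates. Here is a minimal sketch in Python; note that "same data in slightly different forms" needs fuzzier matching techniques, which this simple approach does not cover.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by content hash.

    Any group with more than one entry is a set of exact duplicates
    that need not all be migrated.
    """
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Keep only the hashes that occur more than once.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Running this over a legacy file store before migration gives a quick measure of how much of the 20 to 30% duplication figure applies to your data.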

Ask these questions before translations and migrations begin!


So, what does a structured process to data migration and translation look like?

First a few definitions:

  • Source system – the origin of the data that needs to be translated or migrated. This could be a database or a directory structure.
  • Target system – the final destination for the data. On completion of the process, data in the target should be in the correct format.
  • Verify – Ensure that data placed in the target system is complete, accurate, and meets defined standards.
  • Staging area – an interim location where data is transformed, cleaned, or converted before being sent to the target.

The process consists of five steps as shown below:

[diagram: the five-step migration process]

The process can be described as follows:

  • Data to be migrated is identified in the source system. This is an important step and ensures that only relevant data is moved. Junk data is left behind.
  • The identified data is extracted from the source system and placed in the staging area.
  • The data is then transformed into a format ready for the target system. Such transformation could be a CAD to CAD translation, a metadata change, or a cleaning process. Transformation may also entail data enrichment – for example, append additional properties to the objects so they can be better found in the target system.
  • Transformed data is then loaded into the target system. This can be done automatically via programs or manually, depending on the chosen method. Automatic routines can fail; failures are flagged for analysis and action.
  • Once data is loaded, validation is carried out to ensure that the migrated data is correct in the target system and not corrupted in some fashion.
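The five steps above can be sketched in code. This is an illustration only – the record fields (`part_no`, `status`) and the selection rule are assumptions for the sketch, not any particular PLM system's data model:

```python
def identify(source: list[dict]) -> list[dict]:
    # Step 1: select only relevant records; junk data is left behind.
    return [r for r in source if r.get("status") == "released"]

def extract(records: list[dict], staging: list[dict]) -> None:
    # Step 2: copy the identified data into the staging area.
    staging.extend(dict(r) for r in records)

def transform(staging: list[dict]) -> None:
    # Step 3: clean the data and enrich it with properties
    # the target system requires for searching.
    for r in staging:
        r["part_no"] = r["part_no"].upper()   # cleaning
        r["classification"] = "LEGACY"        # enrichment

def load(staging: list[dict], target: list[dict]) -> list[dict]:
    # Step 4: insert into the target; collect failures for analysis.
    failures = []
    for r in staging:
        (failures if "part_no" not in r else target).append(r)
    return failures

def validate(target: list[dict]) -> bool:
    # Step 5: verify the loaded data is complete and well-formed.
    return all(r.get("part_no") and r.get("classification") for r in target)
```

In a real project each step would be a substantial tool in its own right, but the sequence – and the idea that staging isolates transformation from both source and target – stays the same.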

The process as described above is shown at working level:

[diagram: extractors and loaders moving data from the source system through the staging area to the target system]

Shown in this diagram are two software tools – extractors and loaders. These are usually custom utilities that use APIs, or hooks into the source and target systems, to move the identified data. For example, an extractor tool may query a source PLM system for all released and frozen data that was released after a given date. Once this search is complete, the extractor downloads the identified data from the PLM system into the staging area.

In a similar manner, a loader will execute against a correct data set in the staging area and insert this into a target system, creating the required objects and adding the files.
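An extractor of the kind described might look like the sketch below. The query function, field names, and date format are hypothetical – a real extractor would call the source PLM system's own API:

```python
import json
from pathlib import Path

def extract_released_items(query_fn, released_after: str, staging_dir: str) -> int:
    """Download items released after a given date into the staging area.

    `query_fn` stands in for the source system's search API; it is assumed
    to return an iterable of dicts with 'id', 'status', and 'released_on'
    (ISO date string) keys. Returns the number of items staged.
    """
    staging = Path(staging_dir)
    staging.mkdir(parents=True, exist_ok=True)
    count = 0
    for item in query_fn():
        # ISO dates compare correctly as strings.
        if item["status"] == "released" and item["released_on"] > released_after:
            (staging / f"{item['id']}.json").write_text(json.dumps(item))
            count += 1
    return count
```

A loader would be the mirror image: read each staged file, create the corresponding object in the target system via its API, and attach the files.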

It is highly recommended that pilot migrations be carried out on test data in developmental environments to verify the process. This testing will identify potential bugs and allow them to be fixed before actual data is touched.

Such a structured process will greatly improve your chances of success!

My last post outlined the significance of Product Cost Management (PCM) for OEMs and Suppliers to drive profitability and continuous improvement throughout the entire supply chain.

Ideally, PCM needs to be done early in the product development cycle – as early as the conceptual phase, when design and supplier selection are much more flexible – so it is important to enable cost engineering during the front end of product development and ensure profitability with control over costs for parts and tooling.

Not everyone can optimize cost early, and not in all situations; PCM processes and tools may also need to be applied in later stages of the product lifecycle. Even when fact-based cost models and consultation are applied early in the lifecycle, costing may need to be repeated several times over the lifecycle. PCM therefore needs to support the cost model across all corporate functions, from product development to sales, and establish a single consistent repository for estimating and communicating cost, with repeatable processes and historical information. Because PCM spans the product lifecycle, it is important to take an enterprise-wide approach to costing. An ideal PCM system aligns with the product development process managed in a PLM system, so there is a lot of synergy between PLM and PCM.

The most commonly used tools for PCM – spreadsheets and custom programs that conduct simple rollups – are not suitable for enterprise-wide processes; these solutions do not provide the details required to develop credible cost models. They also make it very difficult for designers to compare products, concepts, and scenarios. Spreadsheets fail due to quality problems and the inability to implement them effectively on an enterprise scale, resulting in different product lines, geographies, or lines of business having different approaches. Non-enterprise approaches also make it difficult to reuse information or apply product changes, currency fluctuations, burden rate updates, or commodity cost changes.

By extending an enterprise-wide system like PLM for PCM functions, cost management is effectively communicated and captured to institutionalize it for future product programs. This eliminates disconnected and inconsistent manual costing models and complex, difficult-to-maintain spreadsheets. It also supports easy, fast, and reliable impact analysis, incorporating product changes accurately into costs with visibility to all cost factors and making these processes repeatable. The PCM process can also leverage the existing 3D model parametric data managed in PLM systems to extract relevant parameters such as thickness, surface, and volume for feature-based calculations. Other PLM data that can be reused for PCM includes labor rates from engineering project management, material costs from material management modules, and bills of materials/process and tooling from engineering and manufacturing data management. An integrated PLM and PCM solution is also important for efficiency, allowing companies to reuse both product data and cost models to facilitate continuous improvement over time.

In the next post of this series, I explain how the Siemens PLM Teamcenter suite supports PCM.

Standing on the beach, overlooking the bountiful, yet imperfect, harvest, he pondered the situation in front of him. “Why are all of my troop mates eating these sand-covered sweet potatoes? In the beginning, they were delicious…and without the sand. Now? These wonderful treats are all but inedible. What if I…

This is the beginning of a tale based on a scientific research project, though it may have evolved into something of an urban legend. The idea is that scientists in Japan, circa 1952, were studying the behaviors of an island full of macaque monkeys. At first, the scientists gave the monkeys sweet potatoes. After a period of time, the scientists started covering the sweet potatoes in sand to observe how the monkeys would react. Not surprisingly, the monkeys still ate the treats, however begrudgingly. Then, the story goes, a young monkey took the vegetable to the water and washed it off. He discovered that it tasted as good as it did before the sand. Excitedly, the young monkey showed this discovery to his mother. Approvingly, his mother began washing hers in the water as well.

Still, the vast majority went on, crunching away on their gritty meals. Over time, a few more monkeys caught on. It wasn’t until a magic number of monkeys were doing this – we’ll say the 100th – that seemingly the entire troop began rinsing their sweet potatoes off in the water.

Call it what you will – social validation, the tipping point, the 100th monkey effect, etc. It all comes down to the idea that we may not try something new, however potentially beneficial, until it’s “OK” to do so. Cloud solutions for PLM could be coming to that point. These products have been in the market for a few years now, and they mature with every update (and no upgrade headaches, either).

IDG Enterprise, in their “2016 IDG Enterprise Cloud Computing Survey,” forecasts that “within the next three years, organizations have the largest plans to move data storage/data management (43%) and business/data analytics (43%) to the cloud.” Another survey, RightScale’s “2017 State of the Cloud Survey,” found that overall challenges to adopting cloud services have declined. One of the most important concerns, security, has fallen from 29% of respondents reporting it to 25%. Security is still a valid concern, though I think the market is starting to trust the cloud more and more.

With our experience and expertise with PLM solutions in the cloud, Tata Technologies can help you choose if, when, and how a cloud solution could be right for your company. Let us know how we can help.

What is data migration and translation? Here is a definition that will help:

  • Information exists in different formats and representations. For example, Egyptian hieroglyphics are a pictorial language (representation) inscribed on stone (format).
  • However, information is only useful to a consumer in a specific format and representation. So, Roman letters printed on paper may mean the same as an equivalent hieroglyphic text, but the latter could not be understood by an English reader.
  • Migration moves data between formats – such as stone to paper.
  • Translation moves data between representations – hieroglyphics to Roman letters.

What must a migration and translation achieve?

  • The process preserves the accuracy of the information
  • The process is consistent

In the PLM world, the requirement for data translation and migration arises as the result of multiple conditions. Examples include changes in technology (one CAD system to another), upgrades to software (from one version of a PLM system to a later one), combination of data from two different sources (CAD files in a directory structure with files in a PDM), acquisitions and mergers between companies (combining product data), and integration between systems (connecting PLM to ERP).

However, migrations and translations can be fraught with problems and require considerable effort. Here are some reasons: […]

Everyone knows that a PLM journey can be a long and expensive path, with frustrations at every turn. The question an organization often asks is: is it worth trying to walk that path?

Effective and correctly implemented PLM can significantly impact several business costs, resulting in large organizational savings. Take a look at the list below and consider how your costs look right now – you may be able to answer your own question.

10 Business Costs Directly Impacted by PLM

  1. Factory Rework and Scrap. These costs can be substantial in a manufacturing organization. Not all rework and scrap is caused by insufficient or miscommunicated engineering and design, but a sizeable percentage is traceable back to this root cause. An effective PLM setup will reduce engineering-originated errors by providing timely and accurate information to the factory floor.
  2. Supplier Quality. Getting timely and accurate information to your suppliers can ensure that they deliver quality parts to your production line. PLM correctly configured can make this happen.
  3. Expedited freight costs. How many times does a product get out of your factories late? In order not to incur penalties, the shipping is expedited at a huge premium. Can any of these incidents be traced back to delayed engineering data? Then a PLM system can help.
  4. Effort to process bids. To win business, you need to respond to RFQs by preparing bids. This effort does not directly generate revenue, and so the preparation process must be as streamlined as possible. Are your key people distracted by bids? Automating the process with a PLM system will reduce the effort required.
  5. Time to create reports. Management requires reports that need to be reviewed. Are these created manually from disparate sources? Why not use a PLM system to generate these reports automatically on demand? There are huge time savings to be had from this enhancement.
  6. Time preparing data for downstream users. How much time does your valuable engineering resource spend extracting, converting, and transmitting engineering data to downstream users? Hours per week? This cost can be avoided completely by setting up a PLM system to deliver this data with no effort from the engineers.
  7. Effort to process engineering change. Your company struggles to process engineering change requests and notices. Many are late and require multiple rework cycles. A PLM can fix that by automating the process and ensuring accurate information.
  8. Cost of physical prototypes. Do you spend a lot of money on building and testing physical prototypes as part of your design process? Do you have to build them all or could some be eliminated by better engineering tools and virtual simulation? A leading-edge PLM system can reduce this dramatically.
  9. Your suppliers deliver parts that require rework. You are constantly getting incorrect parts from your suppliers. But do your suppliers have the right information to begin with? PLM technology can bridge this gap.
  10. Wasted development effort. Do you spend funds developing products that go nowhere? This problem can be addressed by a PLM system that manages your development portfolio more accurately.

Do you have more than three of these costs that concern you or that are out of control? Then you definitely need to take a serious look at implementing or reworking your PLM system. We can help – just let us know.

My last post outlined how an integrated product lifecycle management (PLM) and service lifecycle management (SLM) tool framework can benefit both product development organizations (brand owners) and customers (asset owners) by enabling higher quality service at lower cost, resulting in increased product/asset utilization and productivity. Teamcenter, as a leading PLM platform, supports this vision. With Teamcenter SLM solutions, the service and support phase of the product lifecycle is included in your overall PLM vision. Teamcenter bridges the gaps between the engineering, logistics, manufacturing, and service communities. OEMs and service providers can drive more efficient service operations with a single source of knowledge for both products and assets.

For OEMs, Teamcenter enables them to reuse design and manufacturing data to enhance service content and incorporate service feedback to support Design for Serviceability and other product improvement initiatives. This holistic approach to the full product lifecycle helps the OEM compete successfully in the service market.  Teamcenter unifies SLM with PLM to support bi-directional collaboration between product engineering and service operations. Service teams can capitalize on the re-use of product knowledge from engineering and manufacturing to improve service planning and execution. In return, service teams can provide feedback to engineering to improve product designs for serviceability and reliability.

For the third-party service provider, the service data management and applications allow them to efficiently execute service activities in a global marketplace through a single service platform. Using configuration-driven BOM management, Teamcenter delivers a fully linked, full-lifecycle BOM environment that includes the EBOM, SBOM (Service BOM), and Asset BOM to configure accurate information to support services. Different service disciplines can share a common understanding of support requirements, and service teams can coordinate operational activities for greater compliance, faster service, and lower costs.

The highlights of the solution include:

Maximize Service Knowledge Management and Value

With Teamcenter as the core of your SLM strategy, you have one source of service knowledge management. You can perform service activities with a full understanding of physical product/asset configurations, status, and service history. You can order the correct parts, ensure that the proper training is done, and access all the appropriate information necessary to manage service operations.

Create Effective Service Plans

Service plans are the key to profitable service operations. Teamcenter provides you with the fundamentals to author and publish service documentation as the source of work scope definition. You can drive service operations by providing all the detailed information that teams need to track and understand asset health, such as service requirements, task-by-task procedures, necessary resources, and utilization characteristics. Your technicians have a complete understanding of service needs from Teamcenter, so they are prepared to perform reactive, proactive, and upgrade service activities.

Optimize Service Work with Schedule Visibility

With the detailed service plans in Teamcenter, you can schedule service activities with a complete understanding of the work scope, in order to meet customer expectations for product availability and reliability. Work orders generated from service plans are used to create service schedules. It is the visibility into the schedule and resources provided by Teamcenter that allows you to optimize service events and ensure that the right resources (parts, qualified people, and tools) are reserved for the work.

Empower Service Technicians with Work Instructions

Service technicians are a limited resource. When you provide them with complete, intelligent work packages, technicians can execute service work efficiently, accurately and compliantly. With Teamcenter, you can deliver service work instructions, safety/hazard notes, and service procedures (text, 2D/3D and animations). You can also include asset configurations and data collection requirements. Technicians can enter data, observations or discrepancies, and digitally sign off on work, which automatically updates the service schedule.

“To specialize or not to specialize, that is the question.”

The question of specializing vs. generalizing has arisen in so many aspects: biology, health, higher education, and of course, software.  When one has to decide between the two ends of the spectrum, the benefits and risks must be weighed.

As environments have changed over time, animals have had to make a decision: change or perish. Certain species adapted their biology to survive on plants – herbivores; others, meat – carnivores. When in their preferred environments with ample resources, each can thrive. However, if conditions in those environments change so that those resources are not as bountiful, they may die out. Then comes the omnivore, whose adaptations have enabled them to survive on either type of resource. With this wider capability of survival comes a cost of efficiency: the further you move up through the food chain, the less efficient the transfer of energy becomes. Plants produce energy, only 10% of which an herbivore derives, and the carnivore that feeds on the herbivore only gets 10% of that 10% – i.e., 1% of the original energy.

Three hundred trout are needed to support one man for a year.
The trout, in turn, must consume 90,000 frogs, that must consume 27 million grasshoppers that live off of 1,000 tons of grass.
— G. Tyler Miller, Jr., American Chemist (1971)

When it comes to deciding on a course of action for a given health problem, people have the option to go to their family doctor, a.k.a. general practitioner, or a specialist. There are “…reams of papers reporting that specialists have the edge when it comes to current knowledge in their area of expertise” (Turner and Laine, “Differences Between Generalists and Specialists”), whereas the generalist, even if knowledgeable in the field, may lag behind the specialist and prescribe out-of-date – but still generally beneficial – treatments. This begs the question: what value do we place on the level of expertise? If you have a life-threatening condition, then a specialist would make sense; however, you wouldn’t see a cardiologist if your heart races after a walk up a flight of stairs – your family doctor could diagnose that you need some more exercise.

When it comes to higher education, this choice of specializing or not also exists: to have deep knowledge and experience in a few areas, or a shallower understanding in a broad range of applications. Does the computer science major choose to specialize in artificial intelligence or networking? Or none at all? How about the music major? Specialize in classical or German polka? When making these decisions, goals should be decided upon first. What is it that drives the person? A high salary in a booming market (hint: chances are that’s not German polka)? Or is the goal pursuing a passion, perhaps at the cost of potential income? Or is it the ability to be valuable to many different types of employers in order to change as the markets do? It’s been shown that specialists may not always command a higher price tag; some employers value candidates who demonstrate they can thrive in a variety of pursuits.

Whether you’re looking to take advantage of specialized design products (for instance, sheet metal or wire harnesses) or to gain the value inherent in a general suite of tools present in a connected PLM platform that can do project management, CAPA, and Bill of Materials management, we have the means. A “Digital Engineering” benchmark can help you decide if specialized tools are right for your company. Likewise, our PLM Analytics benchmark can help you choose the right PLM system or sub-system to implement.

Specialize, or generalize? Which way are you headed and why?

In this era of new levels of globalization, product companies are faced with market pressures from global competition and price deflation. Today they seek alternate sources of profitable revenue growth enabled by value-add service products. Developing a service-based revenue stream and then delivering product service that is both effective and profitable has its own challenges, however. Even mature service organizations are seeking new approaches to reach a significantly higher quality of service delivery.

Today in a typical product company, there is no single application to manage the data and the decision points required to deliver effective service. Multiple enterprise applications, including PLM and ERP, and often a combination of local databases, spreadsheets, and stand-alone IT systems are involved in service management. This results in fragmented information and knowledge processes around service delivery.

A new approach centered on incorporating service lifecycle management (SLM) as an integral part of product lifecycle management (PLM) is required in order to achieve significant improvement in service readiness and delivery. First, this approach focuses on making complex products easier and less costly to maintain, allowing for more effective allocation of service resources. The second key component is managing the complexity of service information to reduce the cost and time to create and deliver critical service documentation and service records, while improving the quality and efficacy of this information.

With SLM approached as an extended PLM process, design information can be used to bootstrap and enhance service planning, product changes and updates are directed to modify service work instructions, and field experience provides up-to-date insight into product quality. The bulk of the information required for service – such as illustrations, schematics, and work instructions – already exists within the engineering organization and can be repurposed with relatively little effort. 3D CAD models and bills of materials can be used to create everything from exploded wireframe views to photorealistic renderings, and to produce remove-and-replace animations that help in service execution. Manufacturability and ergonomic simulations can be used to improve the safety and efficiency of repair procedures.

The expanded PLM system needs to act as a centralized repository of the service bill of materials (SBOM) along with the engineering and manufacturing BOMs, so that service items – which are mostly design and manufacturing items repurposed for service – can be synchronized to reflect the most up-to-date information. This synchronization is possible when SLM is part of PLM and shares the same configuration and change management processes.

This way, enterprise PLM systems become the digital backbone of the entire product lifecycle – including SLM – and SLM becomes a dynamic process connected with PLM that continues throughout the useful life of the product or asset. This reduces process fragmentation and provides rich end-to-end context for better and more profitable service.

The combined PLM and SLM approach, along with new service models based on the latest technologies (such as the Internet of Things), enables brand owners to deliver higher quality service at lower cost, resulting in higher profit margins, enhanced brand image, and greater customer loyalty. Product or asset owners who are the end customers also benefit from increased utilization and productivity due to faster and more reliable service.

What do you think? Is your organization connected?

If you are in the business of designing and engineering products, then you have PLM. This is a statement of fact. The question then becomes: what is the technology underpinning the PLM process that is used to control your designs?

Because of the way that technology changes and matures, most organizations have a collection of software and processes that support their PLM processes. This can be called the Point Solution approach. Consider a hypothetical setup below:

The advantage of this approach is that point solutions can be individually optimized for a given process – so, in the example above, the change management system can be set up to exactly mirror the internal engineering change process.

However, this landscape also has numerous disadvantages:

  1. Data often has to be transferred between different solutions (e.g., what is the precise CAD model tied to a specific engineering change?). These integrations are difficult to set up and maintain – sometimes to the point of being manual tasks.
  2. The organization has to deal with multiple vendors.
  3. Multiple PLM systems working together require significant internal support resources from an IT department.
  4. Training and onboarding of new staff is complicated.

The alternative to this approach is a PLM Platform. Here, one technology solution includes all necessary PLM functionalities. The scenario is illustrated below:

It is clear that the PLM Platform does away with many of the disadvantages of the Point Solution; there is only one vendor to deal with, integrations are seamless, training is simplified, and support should be easier.

However, the PLM Platform may not provide the best solution for a given function when compared to the corresponding point solution. For example, a dedicated project management software may do a better job at Program Management than the functionality in the PLM Platform; this may require organizational compromise. You are also, to some extent, betting on a single technology vendor and hoping that they remain an industry leader.

Some of the major PLM solution vendors have placed such bets on the platform strategy. For example, Siemens PLM has positioned Teamcenter as a complete platform solution covering all aspects of the PLM process (refer to my earlier blog post What is Teamcenter? or, Teamcenter Explained). All of the PLM processes that organizations need can be supported by Teamcenter.

Dassault Systèmes have pursued a similar approach with the launch of their 3DEXPERIENCE platform, which also contains all of the functions required for PLM. In addition, both are actively integrating additional functionality with every new release.

So what is your strategy – Point or Platform? This question deserves serious consideration when considering PLM processes in your organization.

© Tata Technologies 2009-2015. All rights reserved.