
This post was originally created in January 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Direct Metal Laser Sintering (DMLS).

What is Direct Metal Laser Sintering?

DMLS is actually part of a broader category commonly referred to as Granular Based Techniques. All granular-based additive manufacturing techniques start with a bed of powdered material. A laser beam or bonding agent joins the material in a cross-section of the part. Then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross-section. The process is repeated until a complete part is produced. The first technique in this category to be commercialized was Selective Laser Sintering (SLS).
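The layer-by-layer cycle described above can be sketched as a simple loop. This is an illustrative Python sketch under assumed values (the layer height and slice count are invented for the example), not actual machine-control code:

```python
# Illustrative sketch of a powder-bed build cycle (not real machine control).
# Assumed: the part arrives as a list of 2D cross-sections, one per layer.

LAYER_HEIGHT_MM = 0.04  # typical DMLS layer height (assumption)

def build_part(cross_sections):
    """Simulate the granular-based build cycle layer by layer."""
    platform_z = 0.0
    fused_layers = []
    for layer in cross_sections:
        # 1. The laser (or bonding agent) joins the powder in this cross-section.
        fused_layers.append(layer)
        # 2. The platform beneath the bed is lowered by one layer height.
        platform_z -= LAYER_HEIGHT_MM
        # 3. A fresh layer of powder is brushed over the top (recoating),
        #    and the cycle repeats until every cross-section is fused.
    return fused_layers, platform_z

part = ["layer %d" % i for i in range(250)]  # 250 slices -> a 10 mm tall part
layers, z = build_part(part)
print(len(layers), round(-z, 2))  # 250 layers, platform lowered 10.0 mm
```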

The Selective Laser Sintering technique was developed in the mid-1980s by Dr. Carl Deckard and Dr. Joseph Beaman at the University of Texas at Austin, under DARPA sponsorship. Deckard and Beaman went on to establish the DTM Corporation with the explicit purpose of manufacturing SLS machines. In 2001, DTM was purchased by its largest competitor, 3D Systems.

DMLS uses essentially the same process as SLS, but the industry draws a distinction between the two that is worth noting: DMLS is performed using a single metal, whereas SLS can be performed with a wide variety of materials, including metal mixtures (where metal is blended with substances such as polymers and ceramics).

What Are the Advantages of this Process?

[…]

There are great engineered products, and then there are commercially successful products. Many variables factor into the profitability of a product: innovation, satisfying customer needs, and delivering a great customer experience through product performance help companies drive sales, command price premiums, and boost their top-line results. While product development engineers focus on the form, fit, and function of their designs to drive innovation and a great customer experience, the product cost impact of their decisions, the expense side of profitability, is often overlooked.

Engineers seldom have visibility into the cost impact of their decisions; they cannot optimize their designs for cost alongside other design parameters because they lack the required information. The biggest challenge in optimizing cost is understanding the different cost parameters, and that requires detailed knowledge of manufacturing processes and cost drivers.

Product Cost Management (PCM) allows product development companies to design for cost by providing early visibility into the cost implications of design decisions. Using PCM processes and tools, they can systematically simulate and evaluate different scenarios to develop an ideal “should cost” model based on detailed cost parameters for materials, manufacturing processes, supply chain, regulatory compliance, and product support and service. This helps them identify cost-saving ideas such as changing materials, simplifying designs, combining parts/functions, or changing production locations.

There are different PCM techniques. Feature-based techniques look at the characteristics of a design to eliminate unnecessary, high-cost design features. Bottom-up approaches based on a Bill of Process (BOP) calculate more accurate cost models from the manufacturing processes themselves, including labor, equipment, tooling, setup, and other production information. This enables companies to perform “what if” analyses by modeling multiple production scenarios.
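As a rough illustration of a bottom-up BOP roll-up, the sketch below sums material cost, per-part processing cost, and amortized batch costs; all operations, rates, and times here are invented for the example:

```python
# Hypothetical bottom-up "should cost" roll-up from a Bill of Process (BOP).
# Every number below is invented for illustration.
from dataclasses import dataclass

@dataclass
class Operation:
    name: str
    cycle_min: float      # minutes per part
    labor_rate: float     # $/hour
    machine_rate: float   # $/hour (equipment + overhead)
    setup_cost: float     # $ per batch
    tooling_cost: float   # $ amortized per batch

def unit_cost(material_cost, bop, batch_size):
    """Material + per-part processing + batch costs spread over the batch."""
    processing = sum(op.cycle_min / 60 * (op.labor_rate + op.machine_rate)
                     for op in bop)
    batch = sum(op.setup_cost + op.tooling_cost for op in bop) / batch_size
    return material_cost + processing + batch

bop = [Operation("stamping", 0.5, 35, 80, 400, 1200),
       Operation("welding",  2.0, 45, 60, 150, 0)]

# "What if" analysis: the same BOP at two batch sizes.
print(round(unit_cost(3.20, bop, 500), 2))   # 11.16
print(round(unit_cost(3.20, bop, 5000), 2))  # 8.01
```

Running the same BOP at two batch sizes shows how setup and tooling amortization dominates the “should cost” at low volumes, exactly the kind of scenario comparison described above.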

The benefits of PCM are not limited to manufacturing companies designing their products for optimal cost by receiving feedback on the cost impact of design decisions – PCM also helps companies that rely on their supply chain to source for optimal pricing. Even though Direct Material Sourcing processes introduce price competition, the resulting prices are seldom based on optimum cost. Using PCM, original equipment manufacturers (OEMs) can simulate their suppliers’ production costs. Even if no supplier can match the ideal “should cost” price point, the OEM can select suppliers knowing how much help they will need to produce at the ideal cost. This in turn drives continuous investment in cost improvement across the supply chain: a win-win scenario for both OEMs and suppliers.

In my next post, I will show you how PLM supports PCM.

This post was originally written in January of 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post will address a common type of 3D printing known as Laminated Object Manufacturing (LOM).

Laminated Object Manufacturing or LOM works by joining layers of material (usually paper or plastic sheet) with an adhesive while a knife or laser cuts cross-sections to build a complete part. Parts are typically coated with a lacquer or sealer after production.

What Are the Advantages of this Process? […]

Everyone knows that a PLM journey can be a long and expensive path, with frustrations at every turn. The question an organization often asks is: is it worth trying to walk that path?

Effective and correctly implemented PLM can significantly impact several business costs, resulting in large organizational savings. Take a look at the list below and consider how your costs look right now – you may be able to answer your own question.

10 Business Costs Directly Impacted by PLM

  1. Factory Rework and Scrap. These costs can be substantial in a manufacturing organization. Not all rework and scrap is caused by insufficient or miscommunicated engineering and design, but a sizeable percentage is traceable back to this root cause. An effective PLM setup will reduce engineering-originated errors by providing timely and accurate information to the factory floor.
  2. Supplier Quality. Getting timely and accurate information to your suppliers can ensure that they deliver quality parts to your production line. PLM correctly configured can make this happen.
  3. Expedited freight costs. How many times does a product get out of your factories late? In order not to incur penalties, the shipping is expedited at a huge premium. Can any of these incidents be traced back to delayed engineering data? Then a PLM system can help.
  4. Effort to process bids. To win business, you need to respond to RFQs by preparing bids. This effort does not directly generate revenue, and so the preparation process must be as streamlined as possible. Are your key people distracted by bids? Automating the process with a PLM system will reduce the effort required.
  5. Time to create reports. Management requires reports that need to be reviewed. Are these created manually from disparate sources? Why not use a PLM system to generate these reports automatically on demand? There are huge time savings to be had from this enhancement.
  6. Time preparing data for downstream users. How much time does your valuable engineering resource spend extracting, converting, and transmitting engineering data to downstream users? Hours per week? This cost can be avoided completely by setting up a PLM system to deliver this data with no effort from the engineers.
  7. Effort to process engineering change. Your company struggles to process engineering change requests and notices. Many are late and require multiple rework cycles. A PLM can fix that by automating the process and ensuring accurate information.
  8. Cost of physical prototypes. Do you spend a lot of money on building and testing physical prototypes as part of your design process? Do you have to build them all or could some be eliminated by better engineering tools and virtual simulation? A leading-edge PLM system can reduce this dramatically.
  9. Your suppliers deliver parts that require rework. You are constantly getting incorrect parts from your suppliers. But do your suppliers have the right information to begin with? PLM technology can bridge this gap.
  10. Wasted development effort. Do you spend funds developing products that go nowhere? This problem can be addressed by a PLM system that manages your development portfolio more accurately.

Do you have more than three of these costs that concern you or are out of control? Then you definitely need to take a serious look at implementing or reworking your PLM system. We can help – just let us know.

“To specialize or not to specialize, that is the question.”

The question of specializing vs. generalizing has arisen in so many aspects: biology, health, higher education, and of course, software.  When one has to decide between the two ends of the spectrum, the benefits and risks must be weighed.

As environments have changed over time, animals have had to change or perish. Certain species adapted their biology to survive on plants (herbivores); others, on meat (carnivores). When in their preferred environments with ample resources, each can thrive. However, if conditions in those environments change so that those resources are no longer bountiful, they may die out. Then comes the omnivore, whose adaptations enable it to survive on either type of resource. With this wider capability for survival comes a cost in efficiency: the further you move up the food chain, the less efficient the transfer of energy becomes. Plants produce energy, of which an herbivore derives only about 10%, and the carnivore that feeds on the herbivore gets only 10% of that 10%, i.e. 1% of the original energy.

Three hundred trout are needed to support one man for a year.
The trout, in turn, must consume 90,000 frogs, that must consume 27 million grasshoppers that live off of 1,000 tons of grass.
— G. Tyler Miller, Jr., American Chemist (1971)

When it comes to deciding on a course of action for a given health problem, people have the option of going to their family doctor, a.k.a. general practitioner, or to a specialist. There are “…reams of papers reporting that specialists have the edge when it comes to current knowledge in their area of expertise” (Turner and Laine, “Differences Between Generalists and Specialists”), whereas the generalist, even if knowledgeable in the field, may lag behind the specialist and prescribe out-of-date – but still generally beneficial – treatments. This raises the question: what value do we place on the level of expertise? If you have a life-threatening condition, then a specialist makes sense; however, you wouldn’t see a cardiologist because your heart races after a walk up a flight of stairs – your family doctor could diagnose that you need more exercise.

When it comes to higher education, this choice of specializing or not also exists: to have deep knowledge and experience in a few areas, or a shallower understanding across a broad range of applications. Does the computer science major choose to specialize in artificial intelligence or networking? Or neither? How about the music major? Specialize in classical or German Polka? When making these decisions, goals should be settled first. What drives the person? A high salary in a booming market (hint: chances are that’s not German Polka)? Pursuing a passion, perhaps at the cost of potential income? Or the ability to be valuable to many different types of employers and change as the markets do? It has been shown that specialists do not always command a higher price tag; some employers value candidates who demonstrate they can thrive in a variety of pursuits.

Whether you’re looking to take advantage of specialized design products (for instance, sheet metal or wire harnesses), or gaining the value inherent in a general suite of tools present in a connected PLM platform that can do project management, CAPA, and Bill of Materials management, we have the means. A “Digital Engineering” benchmark can help you decide if specialized tools are right for your company. Likewise, our PLM Analytics benchmark can help you choose the right PLM system or sub-system to implement.

Specialize, or generalize? Which way are you headed and why?

This post was originally created on December 8, 2016.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post will address a common type of 3D printing known as Fused Deposition Modeling (FDM).

But first, What is Additive Manufacturing?

Additive manufacturing is the process of creating a part by laying down a series of successive cross-sections (2D “sliced” sections of a part). It came into the manufacturing world about 35 years ago, in the early 1980s, and was adopted more widely later in the decade. Another common term for additive manufacturing is 3D Printing – a term which originally referred to a specific process but is now used to describe all similar technologies.
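The idea of successive cross-sections can be made concrete with a small sketch: slicing a sphere into 2D layers and computing each cross-section’s radius. The radius and layer height are arbitrary example values:

```python
# Illustrative slicing: compute the 2D cross-sections (circle radii) of a
# sphere at successive layer heights -- the geometric idea behind "slicing".
import math

def slice_sphere(radius, layer_height):
    """Return the cross-section radius at each z layer of a sphere."""
    n = round(2 * radius / layer_height)  # number of layer steps
    return [math.sqrt(max(radius**2 - (-radius + i * layer_height)**2, 0.0))
            for i in range(n + 1)]

layers = slice_sphere(10.0, 0.2)  # 20 mm sphere, 0.2 mm layers
print(len(layers))                # 101 cross-sections to print
print(round(max(layers), 1))      # widest slice matches the sphere radius
```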

Now that we’ve covered the basics of 3D Printing, What is Fused Deposition Modeling?

It is actually part of a broader category commonly referred to as Filament Extrusion Techniques. Filament extrusion techniques all utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded in a 2D cross-section onto a platform. The platform is lowered and the process is repeated until the part is complete. In most commercial machines, and in higher-end consumer-grade machines, the build area is typically kept at an elevated temperature to prevent part defects (more on this later). The most common form, and the first technology of this type to be developed, is FDM.

The Fused Deposition Modeling technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd., in the late 1980s, and the technology was patented in 1989. The patent for FDM expired in the early 2000s, which helped give rise to the Maker movement by allowing other companies to commercialize the technology.

It should also be noted that Fused Deposition Modeling is also known as Fused Filament Fabrication, or FFF. This term was coined by the RepRap community because Stratasys holds a trademark on “Fused Deposition Modeling.”

What Are the Advantages of this Process? […]

If you are in the business of designing and engineering products, then you have PLM. This is a statement of fact. The question then becomes: what technology underpins the PLM process used to control your designs?

Because of the way that technology changes and matures, most organizations have a collection of software and processes that support their PLM processes. This can be called the Point Solution approach. Consider a hypothetical setup below:

The advantage of this approach is that point solutions can be individually optimized for a given process – so, in the example above, the change management system can be set up to exactly mirror the internal engineering change process.

However, this landscape also has numerous disadvantages:

  1. Data often has to be transferred between different solutions (e.g., what is the precise CAD model tied to a specific engineering change?). These integrations are difficult to set up and maintain – sometimes to the point of being manual tasks.
  2. The organization has to deal with multiple vendors.
  3. Multiple PLM systems working together require significant internal support resource from an IT department.
  4. Training and onboarding of new staff is complicated.

The alternative to this approach is a PLM Platform. Here, one technology solution includes all necessary PLM functionalities. The scenario is illustrated below:

It is clear that the PLM Platform does away with many of the disadvantages of the Point Solution; there is only one vendor to deal with, integrations are seamless, training is simplified, and support should be easier.

However, the PLM Platform may not provide the best solution for a given function when compared to the corresponding point solution. For example, a dedicated project management software may do a better job at Program Management than the functionality in the PLM Platform; this may require organizational compromise. You are also, to some extent, betting on a single technology vendor and hoping that they remain an industry leader.

Some of the major PLM solution vendors have placed such bets on the platform strategy. For example, Siemens PLM have positioned Teamcenter as a complete platform solution covering all aspects of the PLM process (see my earlier blog post What is Teamcenter? or, Teamcenter Explained). All of the PLM processes that organizations need can be supported by Teamcenter.

Dassault Systèmes have pursued a similar approach with the launch of their 3DEXPERIENCE platform, which also contains all of the functions required for PLM. In addition, both are actively integrating additional functionality with every new release.

So what is your strategy – Point or Platform? This question deserves serious consideration when considering PLM processes in your organization.

So you’re a manager at a manufacturing company. You make things that are useful to your customers and you answer to the executives regarding matters such as budgets, efficiencies, timelines and deliverables. You will have at least heard of PLM; perhaps you have attended a conference or two. But how badly do you need to implement it or retool an existing setup?

Here are 10 indicators:

  1. Your staff is always late meeting deadlines. This results from poorly executed projects, inefficient processes, and lack of clear deliverables. All of these problems can be addressed by a PLM system, starting with the enforcement of common processes and followed up by proper project planning.
  2. Department costs are creeping up. You are held to a tight budget by the organization. You are always close to or exceed your budget and it is difficult to get a handle on why. A PLM system can help this in two ways: more efficient processes leading to greater productivity and by providing better information to managers.
  3. Rework is rampant. A lot of work needs to be repeated or reworked because it was not correct the first time. A PLM system can certainly help with this problem by supporting common working practices and introducing checks at crucial points.
  4. Your department is constantly battling other departments. There is a lot of finger pointing and blame that goes around the organization. No one can pin down who is responsible or when information was provided. PLM can provide automatic notifications, timestamped deliverables, and clear and unequivocal instructions.
  5. There is no accountability in your department. It is difficult to diagnose where mistakes were made and who is responsible, and people are always blaming one another. A PLM system can provide objective data that allows problems to be traced to their root cause and accountability to be established.
  6. Overtime is out of control. Excessive overtime worked in your department is always a concern. A PLM system can help improve productivity and give managers better information regarding where inefficiencies exist.
  7. Your competitors always seem better. Your bosses are always holding you up against your competition and showing how they are better. A PLM system can put you ahead because there’s a good chance the competition do not have a PLM system, or have not made good use of it if they do.
  8. External customers complain that they do not get the information they need. You owe your customers information at various stages during the design cycle and they often don’t receive it in a timely manner. A suitably configured PLM system can improve this dramatically.
  9. Your suppliers provide the wrong information. You are constantly going around in circles with your suppliers regarding information. But do your suppliers have the right capabilities to begin with, and do you have the capability to meet them on the same terms? PLM technology can bridge this gap.
  10. Process adherence is poor. Although you have some level of documented processes, adherence is poor. A correctly configured PLM system can fix this quickly.

Do you have three or more of these issues keeping you up at night? Time to take a serious look at a PLM system.

There they were, sailing along their merry way, when a narrow strait appeared on the horizon. As the boat got closer, they noticed a couple of strange features: on one side a cliff, on the other a whirlpool. Upon arrival, it became apparent that this was the cliff where the monster Scylla dwelt; on the other side lurked the monster Charybdis, spewing out huge amounts of water and causing deadly whirlpools. Each monster was close enough that avoiding one meant meeting the other. Determined to get through, our intrepid hero Ulysses had to make a decision. The idiom “between Scylla and Charybdis” comes from this story; in more modern terms, we would translate it as “the lesser of two evils.”

PLM administrators, engineering managers, and IT teams are often given this same choice, with equally deadly – well, unfortunate – outcomes. What is the dilemma? Customize the PLM system (beyond mere configuration) to match company policies and processes, or change the culture to bend to the limitations of the “out of the box” configuration.

Companies will often say something to the effect of “We need the system to do X,” to which many vendors meekly reply, “Well, it can’t exactly do X, but it’s close.” So what is a decision-maker to do? Trust that the organization can adapt, risking lost productivity and possibly mutiny? Or ask, “What will it take to get it to do X?”, incurring the risk of additional cost and implementation time?

We can elaborate further on the risks of each. When initially developing customizations, there is the risk of what I call “vision mismatch”: the requirement X is described as well as possible, but the full understanding of the bigger picture is lost when the developer writes up the specification. This leads to multiple revisions of the code and frustration on both sides of the table. Customizations also carry the longer-term risk of “locking” you into a specific version: while you keep your processes perfectly intact, the system is stuck in time unless the customizations are upgraded in parallel. Some companies avoid that by never upgrading… until their hardware, operating systems, or underlying software become unsupported and obsolete. Then the whole thing can come to a crashing halt. Hope the backups work!

However, not customizing has its own risks. What if the new PLM system is replacing an older “homegrown” system that automated processes the new system cannot? (A “homegrown” system comes with its own set of risks: the original coder leaves the company, the code was never commented, there are no specifications, etc.) For example, raising an issue may have automatically created an engineering change request while starting a CAPA process; the company has now gained a manual process, exposing it to human error. Or perhaps the company has a policy requiring change orders to go through a “four-eyes” approval process, and the new system has no mechanism to support such a use case.

Customizing is akin to Charybdis, whom Ulysses avoided, deciding it was better to knowingly lose a few crew members than to risk losing the entire ship to the whirlpool. Not customizing is more like Scylla: the risk to the whole enterprise is lower, but some loss is almost certain.

We’ve been through these straits and lived. We’ve navigated them with many companies, from large multinationals to the proverbial “ma and pa” shops. Let us help you navigate the dangers with our PLM Analytics benchmark.

Enterprise-wide PLM data systems hold huge amounts of business data that can potentially be used to drive business decisions and effect process changes that generate added value for the organization. Many PLM users are unaware that such valuable data exists, while for others, advanced data search and retrieval can feel like looking for a needle in a haystack because they are unfamiliar with the PLM data model. Hence it is important to process the data into meaningful information and model it into actionable engineering knowledge that can drive business decisions for ordinary users. Reporting plays a key role in summarizing that large amount of data into a simple, usable format that is easy to understand.

Reporting starts with capturing the right data – the most important step, and often the least emphasized. When data is not captured in the right format, the result is inconsistent or non-standard data.

Let’s take a simple workflow example: workflow rejection comments are valuable information, helping companies understand the recurring reasons for rejection and improve first-time yield (FTY) by developing training plans to address them. Users might not enter rejection comments unless they are mandatory, so it is important to have data-model checks and balances that capture the right data and standardize it through categorization and lists of values (LOVs).
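A minimal sketch of such a check, assuming a hypothetical LOV of rejection categories (the category names and field rules are invented for illustration):

```python
# Sketch of a data-model check: a workflow "Reject" action is only accepted
# when the comment is categorized against a List of Values (LOV).
# The LOV entries below are invented example categories.

REJECTION_LOV = {"Missing data", "Wrong revision", "Drawing error", "Other"}

def validate_rejection(category, comment):
    """Return a list of validation errors (an empty list means acceptable)."""
    errors = []
    if category not in REJECTION_LOV:
        errors.append(f"Category must be one of {sorted(REJECTION_LOV)}")
    if not comment or not comment.strip():
        errors.append("A rejection comment is mandatory")
    return errors

print(validate_rejection("Wrong revision", "BOM references rev B, not rev C"))
print(validate_rejection("", ""))  # two errors: bad category, empty comment
```

Checks like this, enforced at the point of entry, are what turn free-text rejection comments into standardized data that can later be reported on.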

The next step is to filter and present the right information to the right people. End users typically want to run pre-designed reports and perhaps slice and dice the data to understand it better. Business intelligence designers and business analysts who understand the PLM schema and its business relationships are the ones who design the report templates. Report design is sometimes perceived as an IT or software function, and as a result sufficient business participation is not ensured, which reduces the effectiveness of the reports for end users. It is important to have business participation from report identification through report design to report usage. Business process knowledge is the key here, not PLM tool expertise alone.
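Once rejection reasons are captured against an LOV, even a simple roll-up yields a usable report for end users. This sketch uses invented sample data:

```python
# Roll up standardized rejection categories into a "top causes" report.
# The sample data below is invented for illustration.
from collections import Counter

rejections = ["Wrong revision", "Missing data", "Wrong revision",
              "Drawing error", "Wrong revision", "Missing data"]

report = Counter(rejections).most_common()  # sorted by frequency, descending
for category, count in report:
    print(f"{category}: {count}")  # top cause first: "Wrong revision: 3"
```

In practice the same aggregation would run against the PLM database rather than an in-memory list, but the principle – standardized categories first, then simple counting – is the same.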

Since business processes are improved and modified based on market and performance trends derived from PLM reports, it is important to have continuous improvement initiatives that fine-tune reporting to these improved processes and new baselines, from data capture through presentation. That makes it a continuous cycle: business processes need to be designed to support reporting, and reports need to help improve the processes.

Properly designed reports provide increased visibility into shifting enterprise-wide status, reduce the time and cost of data analysis, ensure quicker response times and faster product launch cycles, and improve product quality and integrity.

How do your reports measure up? Do you have any questions or thoughts? Leave a comment here or contact us if you’re feeling shy.

© Tata Technologies 2009-2015. All rights reserved.