Industry Insights

Read articles by industry-leading experts at Tata Technologies as they present information about PLM products, training, knowledge expertise, and more. Sign up below to receive updates on posts by email.

In a race, the quickest runner can never overtake the slowest, since the pursuer must first reach the point whence the pursued started, so that the slower must always hold a lead.

— Aristotle, Physics VI:9, 239b15

This paradox, first posed by Zeno and later retold by Aristotle, shows us that a mathematical argument can be refuted by taking its hypothesis to an absurd conclusion. To look at it another way, consider this joke:

A mathematician and an engineer are trapped in a burning room.

The mathematician says “We’re doomed! First we have to cover half the distance between where we are and the door, then half the distance that remains, then half of that distance, and so on. The series is infinite.  There’ll always be some finite distance between us and the door.”

The engineer starts to run and says “Well, I figure I can get close enough for all practical purposes.”
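The engineer’s “close enough” is easy to put numbers on. Here is a tiny Python sketch – purely illustrative, with the distance to the door normalized to 1 – showing how quickly the remaining gap shrinks below any practical threshold:

```python
# Purely illustrative: halve the remaining distance to the door ten times.
distance = 1.0  # normalized distance between us and the door

for step in range(1, 11):
    distance /= 2
    print(f"after step {step:2d}: {distance:.6f} of the distance remains")

# After ten halvings, less than 0.1% of the distance remains: an infinite
# series in theory, "close enough for all practical purposes" in fact.
```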

The principle here, as it relates to simulation tools like FEA, is that every incremental step in the simulation process gets us closer to our ultimate goal of understanding the exact behavior of the model under a given set of circumstances. However, there is a point of diminishing returns, beyond which a physical prototype must be built. Even so, the evolution of design simulation has saved a lot of money for manufacturers who, in the past, would have had to build numerous physical prototypes, iteration after iteration. This evolution of FEA reminds me of…

The uncanny valley is the idea that as a human representation (a robot, wax figure, animation, 3D model, etc.) increases in human likeness, people feel more affinity towards it – up to a certain point. Once that threshold is crossed, our affinity drops off to the point of revulsion, as with zombies or “intermediate human-likeness” prosthetic hands. As realism continues to increase beyond the valley, however, affinity starts to rise again.

Personally, I find this fascinating – that a trend moving through time can abruptly change direction and then, for some strange reason, revert to its original direction. Why does this happen? There are myriad speculations on the Wikipedia page, which I encourage the reader to peruse at leisure.

But to tie this back to FEA, think of the beginning of the uncanny valley curve as the start of computer-aided design simulation, with time on the horizontal axis and accuracy on the vertical axis. I posit that over time, as simulation software has improved, the accuracy of our simulations has also increased. Ease of use has improved as well, allowing non-doctorate holders to make simulation part of their design process.

And this is where we see the uncanny valley: as good as the software is, there comes a point – with specialized, intricate, or non-standard analyses – where the accuracy of the software falters. This tells us that there will still be a need for those PhDs; once they join the design effort and start using the software, accuracy climbs steeply again.

If you need help getting to the door, or navigating the valley, talk to us about our Simulation benchmark process. Leave a comment or click to contact us.

 

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Selective Laser Sintering (SLS).

But first, What is Additive Manufacturing?

Additive manufacturing is the process of creating a part by laying down a series of successive cross-sections (2D “sliced” sections of a part). It came into the manufacturing world in the early 1980s, about 35 years ago, and was adopted more widely later in the decade. Another common term for additive manufacturing is 3D Printing – a term which originally referred to a specific process, but is now used to describe all similar technologies.
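To make “successive cross-sections” concrete, here is a minimal Python sketch – an invented illustration, not any vendor’s slicer – that slices a simple solid (a sphere) into the 2D layers an additive machine would build one at a time:

```python
import math

# Illustrative only: "slice" a sphere of radius 20 mm into horizontal layers.
# Each layer is the 2D cross-section (here just a circle radius) that an
# additive machine would deposit or fuse before stepping up to the next one.
RADIUS = 20.0        # mm, invented example part
LAYER_HEIGHT = 5.0   # mm, exaggerated so the output stays short

z = -RADIUS
while z <= RADIUS:
    # Radius of the circular cross-section of the sphere at height z.
    r = math.sqrt(max(RADIUS**2 - z**2, 0.0))
    print(f"layer at z = {z:6.1f} mm -> cross-section radius {r:5.1f} mm")
    z += LAYER_HEIGHT
```

A real machine works through exactly this kind of layer list, which is one reason part height is such a strong driver of build time.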

Now that we’ve covered the basics of 3D Printing, What is Selective Laser Sintering?

It is actually part of a broader category commonly referred to as granular-based techniques. All granular-based additive manufacturing techniques start with a bed of powdered material. A laser beam or bonding agent joins the material in a cross-section of the part; then the platform beneath the bed of material is lowered, and a fresh layer of material is brushed over the top of the cross-section. The process is repeated until a complete part is produced. The first commercialized technique in this category is known as Selective Laser Sintering.

The Selective Laser Sintering technique was developed in the mid-1980s by Dr. Carl Deckard and Dr. Joseph Beaman at the University of Texas at Austin, under DARPA sponsorship. Deckard and Beaman went on to establish the DTM Corporation with the explicit purpose of manufacturing SLS machines, and in 2001 DTM was purchased by its largest competitor, 3D Systems.

What Are the Advantages of this Process?

SLS is quick – one of the fastest rapid prototyping techniques, though, relatively speaking, most techniques are fast. SLS also has the widest array of usable materials: theoretically, just about any powdered material can be used to produce parts. In addition, it can be one of the most accurate rapid prototyping processes, the major limiting factor being the particle size of the powdered material.

Because parts are created in a bed of material, there is no need for the support structures required by other forms of rapid prototyping, which helps avoid secondary operations and machining. Another advantage of the material bed is the ability to stack multiple parts in the build envelope, which can greatly increase the throughput of an SLS machine.

What Are the Disadvantages of this Process?

Of the commercially available rapid prototyping machines, those that use the Selective Laser Sintering technique tend to have the largest price tag. This is usually due to the production scale these machines are designed for, which makes them much larger than others.

SLS can be very messy. The build material is a bed of fine powder and, if not properly contained, it will get EVERYWHERE. In addition, breathing in powdered metals and polymers can be very hazardous to one’s health; most machines account for this, but it is certainly something to be cognizant of when manufacturing.

Unlike some other additive processes, SLS limits each part to a single material. This means parts printed on SLS machines will have uniform material properties throughout.

Because the material is not fully melted, this process does not create fully dense parts, so they will be weaker than parts made with traditional manufacturing processes. Fully dense parts can, however, be produced by similar processes such as Selective Laser Melting (SLM).

In Conclusion

There are quite a few different ways to 3D print a part, with unique advantages and disadvantages of each process. This post is part of a series, discussing the different techniques. Thanks for reading!

When we talk with customers who may need to enhance their PLM technology or methods, we commonly encounter two schools of thought. Generally, companies start the conversation with one of two focuses: either CAD-focused or process-focused.

CAD-centric companies are the ones that rely heavily on design and engineering work to support their business. They generate a lot of CAD data, and eventually this CAD data becomes a real pain to manage with manual processes. Things get lost, data is hard to locate, design reuse is only marginally successful, and the release process carries a lower level of confidence. These companies usually start thinking about PLM because they need to get their CAD data under control, and they usually start with a minimal approach that is just sufficient to tackle that obvious problem. Sometimes other areas of PLM are discussed, but they are “planned” for a later phase, which inevitably turns into a “much later” phase that still hasn’t happened. What they have done is grease the squeaky wheel while ignoring the corroding frame that is potentially a much bigger problem. CAD-centric companies often benefit from taking a step back to look at their processes; many times they will find that is where the biggest problems lie.

BOMs are often associated with CAD geometry, but many times this isn’t the case.

Companies that don’t deal with a lot of CAD data can often realize the benefits of PLM from a process improvement perspective. Product introductions, project management, BOM management, customer requirements, change management, and quality management are just some of the areas PLM can help improve. Many process-focused companies already have systems in place to address these topics, but those systems are often not optimized, and usually not connected. They tend to be individual silos of work or information, which slows the overall “get to market” process and reduces the overall effectiveness of the business. These companies might not have the obvious “squeaky wheel” of CAD data to manage, but they have PLM challenges just the same. The key to improvement is to identify the challenges and actually do something about them.

In either case, Tata Technologies has the people and processes to help identify and quantify your company’s biggest challenges through our PLM Analytics process.  This process was developed specifically to address the challenges companies have in identifying and quantifying areas for PLM improvement.  If you’re interested in better identifying areas of improvement for your company’s PLM process, just let us know.  We’re here to help.

 

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding the common practices and techniques. So, this week’s blog post will address a common type of 3D printing known as Stereolithography.

But first, What is Additive Manufacturing?

Additive manufacturing is the process of creating a part by laying down a series of successive cross-sections (2D “sliced” sections of a part). This technology came into the manufacturing world in the early 1980s, about 35 years ago, and was adopted more widely later in the decade. A more common term for additive manufacturing is 3D Printing – a term which originally referred to a specific process, but is now used to describe all similar technologies.

Now that we’ve covered the basics of 3D Printing, What is Stereolithography?

Stereolithography is the process of building an object by curing layers of a photopolymer, which is a polymer that changes properties when exposed to light (usually ultraviolet light). Typically this causes the material to solidify, or cure.

This technique uses a bath or vat of material. An ultraviolet laser cures a layer of photopolymer on a platform; the platform is then lowered into the bath, and another layer of material is cured over the top of it.

A variation on this technique, referred to as Poly- or Multi-Jet printing, modifies the process slightly. Instead of using a bath of material, jet printing uses separate reservoirs of material that are deposited by print heads and then cured by UV light. The material reservoirs are quite similar to inkjet printer cartridges, and the process functions much like an inkjet printer. This technique was developed by Objet, which was acquired by Stratasys in 2012.

What Are the Advantages of this Process?

Stereolithography is fast. Working prototypes can easily be manufactured within a short period of time. This, however, is greatly dependent on the overall size of the part.

SLA is one of the most common rapid prototyping techniques used today. It has been widely adopted by a large variety of industries, from medical, to automotive, to consumer products.

The SLA process allows multiple materials to be used in one part. This means a single part can have several different structural characteristics and colors, depending on where each material is deposited. In addition, all of the materials used in SLA are cured through the same process, which allows materials to be blended during manufacturing to create custom structural characteristics. It should be noted, however, that this is only available on MultiJet or PolyJet machines.

Of all the technologies available, SLA is considered the most accurate. Capable of holding tolerances under 20 microns, accuracy is one of the largest benefits of this technique.

What Are the Disadvantages of this Process?

Historically, due to the specialized nature of the photopolymers used in this process, material costs were very high compared to other prototyping processes – anywhere from $80 to over $200 per pound. The cost of a machine is considerable as well, ranging anywhere from $10k to well over $100k. Recently, though, renewed interest in the technology has brought more consumer-grade SLA machines to market, which has helped drive down prices. New material manufacturers have also appeared in recent years (spot-A Materials and MakerJuice Labs), cutting prices drastically.

Stereolithography is a process that requires a support structure. This means that any part produced with this technique will require a secondary operation after fabrication.

In Conclusion

There are quite a few different ways to 3D print a part, with unique advantages and disadvantages of each process. This post is the first part of a series, discussing the different techniques. Thanks for reading!

Sometimes CAD can be used to start establishing PLM practices. Since PLM systems rely on data to be effective, ensuring consistent and correctly entered information is paramount. Capabilities like classification with properties and metadata can rely on CAD very heavily to be used effectively. For example, let’s consider the classification and data for a machined part. If the part is going to require machining, we could assign it a classification of “Machined.” Since the part is going to be machined, we would want to ensure that “Stock Size” is one piece of metadata to be tracked. Most CAD systems have a way to ensure this “Stock Size” is at least filled out, and some can even be automated to calculate the stock size without any user intervention. Of course, a repeatable logic would need to be defined, but once that is done, the time spent on stock size calculations – and the potential for errors – would be eliminated.

 

Case in point: Utilize iLogic in Autodesk Inventor to calculate stock size for machined parts. Once this is done, users can forget about manually checking all the measurements; all they need to do is flag the part as “Machined” and the system does the rest!
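As a sketch of that repeatable logic – written here in Python rather than iLogic’s VB.NET, with an invented machining allowance and stock increment – the rule might look like this:

```python
import math

def stock_size(length, width, height, allowance=3.0, increment=5.0):
    """Compute a stock size for a machined part from its bounding box.

    Adds a machining allowance to each dimension, then rounds each one up
    to the next standard stock increment so the value is always orderable.
    All dimensions in millimetres; allowance and increment are example
    values, not Inventor defaults.
    """
    def round_up(dim):
        return math.ceil((dim + allowance) / increment) * increment

    return tuple(round_up(d) for d in (length, width, height))

# Example: a part with a 47 x 22 x 9 mm bounding box
print(stock_size(47, 22, 9))  # -> (50, 25, 15)
```

In Inventor, the equivalent iLogic rule would read the part’s bounding box and write the result to a “Stock Size” iProperty; the point is that the calculation is defined once and then runs the same way every time.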


It’s that time of year in the Siemens PLM world. Siemens NX 11 was released in late August, and new functionality and streamlined tools always get me excited. This year marks the tenth year I’ve been using Siemens products – a short time in our industry, I realize, but it does make me feel a bit nostalgic about my introduction to them.

Ten years ago, I graduated college with my B.S. in Mechanical Engineering and started my first job as a Project Engineer with a company that produced high-end carbon fiber performance parts. A friend of mine from college was starting there at around the same time, and when I asked him which design software we would be using, he informed me it would be a product called NX. At that time, not being well versed in all the CAD options in the marketplace, I was not familiar with Siemens NX and worried that I was about to become experienced in a piece of software that wasn’t widely used. As I said, I was not very well aware of the true marketplace!

We started on NX 2, and it was new software for the company, so, as the young engineers, we were tasked with proving out what it was really capable of. From the very beginning, I took to it much more quickly than I ever had with the PTC or SolidWorks tools I had used in school. NX offered not only ease of use but powerful design tools I had never had access to before. Since we were a manufacturing shop as well, we picked up NX CAM to program our NC mills and lathes to produce the fixtures and tooling used to create our parts. Once again: new software, new capability, but nobody knew it, so it fell to us to learn another part of the software. Eventually, we also procured Femap to run static and dynamic load analyses on our composite layups to ensure part strength and durability (we were creating carbon fiber prosthetic ankles at the time that had to cycle through millions of steps over the course of a month to pass quality requirements). So within a year I had come to know the CAD, CAM, and CAE sides of Siemens applications quite well, and I continued to learn and grow with the software during my years there.


Fast forward 10 years, a few jobs, and countless projects and experiences with Siemens products, and I still find myself impressed.  I remember when Synchronous Technology was first released, and the impact it had on the industry.  I remember year after year of functionality improvement, GUI improvement, dialog improvement, system stability and capability improvements.  I remember the advancement of freeform tools, and the “wows” as users and prospective users found ways to do their jobs they had never seen before.  The Siemens product line itself has continued to grow and become more diverse over that time, delving into every aspect of modern product design, from industrial styling to noise and vibration analyses. Siemens’ acquisitions of industry-leading software companies, and the integration of those technologies into their flagship products, have positioned them as a world leader in digital engineering, digital manufacturing, and Product Lifecycle Management.

I feel lucky that I have been able to touch so many different aspects of the software over the last 10 years, and I am always amazed at the improvements that come with each and every release.

Siemens PLM continues their long history of creating the most powerful and flexible design software in the world today. And as for NX 11, I covered some of the most exciting new features and functionalities in a webinar we hosted just last month. Missed my presentation the first time around? Click here to watch it on demand.

There is a phenomenon in today’s entertainment media culture called the Alternate Reality Game, or ARG. ARGs aim to bring the audience deeper into the product experience, to rev up the hype. The game’s producers build an elaborate web of interconnected “plot points,” for lack of a better term. These plot points are not overt, not communicated outright; the audience has to dig for them by picking up clues and solving puzzles, each leading to the next, and so on.

A recent entry into the world of ARGs comes from Blizzard Entertainment’s game Overwatch, whose players are hungry for information about the next character planned for release. It started with a list of character names on a piece of paper seen in the game – all familiar except one. That started rumors that the unfamiliar name could be the next character to be revealed. Next, Blizzard released a couple of videos on YouTube in which a seemingly innocuous blip (below left) turned out to be an image; enhancing the colors revealed a grid of numbers (below right). A member of the community read the numbers as hexadecimal bytes, XOR’ed each one with the number 23, and mapped the results back to readable characters…which turned out to be a phrase written in Spanish.

[Image: the innocuous blip from the video (left) and the color-enhanced grid of numbers (right)]
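The decoding step is only a couple of lines of code. This Python sketch uses made-up hex values rather than the real payload, but applies the same XOR-with-23 transform described above:

```python
# Made-up example bytes -- not the actual values from the Blizzard video.
hex_values = ["7F", "72", "7B", "7B", "78"]

# XOR each byte with 23 (decimal) and map the result to an ASCII character.
decoded = "".join(chr(int(h, 16) ^ 23) for h in hex_values)
print(decoded)  # -> "hello" for this invented input
```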

Another puzzle was solved when a similar anomaly appeared in another video: a series of images resembling the color test patterns you used to see on a TV station late at night after it stopped broadcasting. One of the images was a series of horizontal black and white lines. A player took those lines, turned them 90 degrees, and read them with a barcode reader. The result was, of course, more hex values, which were converted into binary. Rendering that binary as pixels – 1s black, 0s white – ultimately revealed a QR code. What did the code say? “Was that easy? Now that I have your attention, let’s make things more difficult.” As of this writing, the ARG is still alive and well, with more pieces being revealed regularly.
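The bits-to-pixels step can be sketched just as briefly. This Python example (using the Pillow library and an invented 8×8 bit pattern, not the real payload) paints 1s black and 0s white:

```python
from PIL import Image  # pip install pillow

# Invented 8x8 bit pattern standing in for the real decoded payload.
bits = (
    "11111111"
    "10000001"
    "10111101"
    "10100101"
    "10100101"
    "10111101"
    "10000001"
    "11111111"
)
SIZE = 8

img = Image.new("1", (SIZE, SIZE))  # 1-bit image: 0 = black, 1 = white
for i, bit in enumerate(bits):
    # A "1" in the payload becomes a black pixel, a "0" a white one.
    img.putpixel((i % SIZE, i // SIZE), 0 if bit == "1" else 1)

# Scale up with no smoothing so the individual "modules" stay crisp.
img.resize((SIZE * 32, SIZE * 32), Image.NEAREST).save("payload.png")
```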

Where Does PLM Come In?

09 0C 0B 0E 0B 17 04 12 19 0B 11 06 19 03 12 17 18 03 02 19 18 18 15 04 05

So why is this story in a PLM Insights blog? Well, I’ve seen many companies treat their design and engineering data like an ARG – meaning that lots of people are interested in it, it’s all over the place, and only a few (smart and creative) people really know how to find it. Whether it’s the “smart” part-naming scheme from the late 80s that needs some kind of secret decoder ring, or the folder-naming conventions where folder names serve as some sort of obscure metadata for the files they contain.

An example part file (the names and numbers have been changed to protect the innocent):

S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813\Units Designed By ABC\22648-9399 OP 30 Backup ASM Fixture reworked (same as 22648-9899) See XYZ Folder for new Op80 design\OP80 reworked 4-16-08 same as 22648-9899\Libraries\OP30\Purchase\Stndrd\fd65645_2

Here we have used just about every bit of the Windows path length limit (which is 260 characters, for those interested: 3 for the volume designation, 256 for the path and file name with extension, and a terminating null character). Anyone who can manage files this way is clearly a talented individual. More impressive still, this was one member of a 23-person design team that did all of their projects this way. I can’t imagine the frustration of their new hires trying to find anything in this kind of environment.
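For the curious, a throwaway Python snippet makes the arithmetic concrete by checking the example path above against that limit:

```python
# Classic Windows MAX_PATH: 3 (drive letter, colon, backslash)
# + 256 (path, file name, extension) + 1 (terminating null) = 260.
MAX_PATH = 260

path = (r"S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813\Units Designed By ABC"
        r"\22648-9399 OP 30 Backup ASM Fixture reworked (same as 22648-9899)"
        r" See XYZ Folder for new Op80 design\OP80 reworked 4-16-08 same as"
        r" 22648-9899\Libraries\OP30\Purchase\Stndrd\fd65645_2")

print(f"{len(path)} characters used of {MAX_PATH - 1} usable")
```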

The benefits of searchable data are pretty clear (see Google). Yet to this day, companies still use antiquated methods because they think it’s “cheaper” than implementing a PLM system. Our PLM Analytics benchmark and impact analyses have proven otherwise – and that’s before counting the myriad other benefits a PLM system offers. Let us know if you’re done playing the engineering and design ARG!

FYI, there is no ARG in this post…or is there?

Among technology practitioners, there is no shortage of pundits offering predictions into the future and where the next big wave is going to hit. The reason for this is that the stakes are high – a correct forecast of future technology trends can literally be worth billions.

So what are the current predictions talking about? Here is a sampling of the current buzz:

  1. Big Data
  2. Social Media
  3. Crowd Sourcing
  4. Social Computing
  5. Mobile Connectivity

So how does this impact PLM? Traditionally, PLM is conducted on internal infrastructure, in secured environments, using traditional devices. For example, an average designer concerned with the creation of 3D CAD data would be working on a company workstation behind a firewall. Equally, an engineer creating BOM data would be using a secured client installed on his company laptop. The engineer might take his laptop home and interact with the PLM system via a VPN, but that is probably the extent of “mobility.”

Returning to the technology buzz, consider the potential impact of two trends – mobile connectivity and social computing. Consider the following scenarios:

  1. Your newly recruited engineer has transitioned his digital life to his tablet and no longer uses a laptop. (hence the title of this piece)
  2. The VP of Engineering wants to query the status of his product introduction using his mobile phone.
  3. Your company wants immediate access to customer feedback on existing products so that this can be translated into requirements for new or updated designs.

Given the traditional model sketched out earlier, implementing anything close to these scenarios is almost impossible. The infrastructure, mindset, and processes will not support mobile connectivity from alternative devices, nor allow general access to a requirements-gathering front end. These scenarios also raise a whole set of questions around data security, use of private devices, and non-company access. While the technology to achieve them probably exists, it would require considerable investment of money and effort to make it happen.

This leads to the fundamental risk-versus-investment equation. It may be possible to construct a business case that justifies the outlay. At a high level, two possibilities exist:

  1. Traditional PLM infrastructure is good enough for at least the next ten years and can be re-evaluated then.
  2. Changing the way business is conducted is a do or die activity and this includes PLM.

An informal survey of small to medium-sized companies shows that most participants have not even considered these technology trends – in part because there appears to be no business imperative, and in part because there are other, more attractive avenues for immediate investment.

So, do you want your engineers to be doing all their engineering work on a tablet anywhere in the world?

You have a PLM system. Fundamental to this system are the concepts of a version and a revision. Yet this is probably the most misunderstood process in the PLM realm: the terms mean different things to different people and are often used interchangeably and inconsistently.

For the purposes of the rest of this piece, we will use the following definitions:

Version – represents a small incremental change in the design that would be saved in the database. Versions are not necessarily saved permanently beyond a revision.

Revision – represents a significant event in the design process and is saved permanently in the database for reference throughout the design process.
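One way to make the distinction concrete is a toy data model. This Python sketch (invented names, not any PLM vendor’s API) keeps versions as transient work-in-progress and exposes only revisions to the wider team:

```python
class DesignItem:
    """Toy model of the version/revision distinction defined above."""

    def __init__(self, name):
        self.name = name
        self.versions = []   # small incremental saves; may be purged later
        self.revisions = []  # significant events, kept permanently

    def save_version(self, data):
        # Work in progress: visible to the author, not the wider team.
        self.versions.append(data)

    def create_revision(self):
        # Promote the latest version to a permanent, shareable revision.
        self.revisions.append(self.versions[-1])
        self.versions.clear()  # versions need not survive past a revision
        return self.revisions[-1]

    def latest_released(self):
        # What collaborators should consume: the "best so far."
        return self.revisions[-1] if self.revisions else None

sheetmetal = DesignItem("sheetmetal")
sheetmetal.save_version("hole pattern, draft A")
sheetmetal.save_version("hole pattern, draft B")
sheetmetal.create_revision()
print(sheetmetal.latest_released())  # -> "hole pattern, draft B"
```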


It is often confusing to talk about this subject because the terms are used interchangeably. The distinction between a version and a revision is not clearly understood – to the extent that some participants think they are the same thing. Because of this, it is important that any organization with a PLM system ensure that all participants clearly understand the definitions and the difference between them.

In a collaborative PLM environment, participants depend heavily on viewing and using data generated by other participants. For example, a headlamp engineer needs the positions of the locating holes in the sheetmetal to be able to design his locating pins (if this is the order of precedence). In this scenario, the headlamp engineer will say, “I need the latest sheetmetal to begin my design.” This statement is common in design and engineering teams, but it is inherently imprecise because it begs the question: do you need the latest version or the latest revision?

Based on the definitions given earlier, what is really required is the latest revision. A version is work in progress and could be incomplete or half-done, because the responsible author may be in the middle of a redesign or a new concept. For this reason, a version should not be visible to the larger organization; only revisions should be accessible, as they satisfy the definition of “best so far.” This concept is very difficult to get across, and it is the conundrum referred to in the title: it takes some courage to work on data that will change sometime in the future, but this is absolutely required for an efficient design process.

The version/revision conundrum also leads to some interesting human psychology. Consider any collaborative design environment where multiple participants have to submit data into a PLM system to progress a large project. It is important in these environments that all participants follow the mantra of “publish early, publish often” or, per the nomenclature of this piece, create many revisions. This is based on the principle that incomplete or slightly inaccurate data is better than no data at all.

However, process managers often put in place systems that highlight inaccuracies or incomplete data, effectively punishing early publishers. So data authors hold back and only create revisions late in the process, once they are certain of accuracy. This is counterproductive.

So pay attention to the version/revision conundrum; clear definitions and policies of this simple issue can greatly improve a PLM process.

 

A Bill of Material (BOM) at its core is a very simple concept: a list of components needed to manufacture a finished product. So if one was making a pair of spectacles, the BOM may look as follows:

Finished Product    Spectacles    Quantity
Item 1              Right Lens    1
Item 2              Left Lens     1
Item 3              Frame         1
Item 4              Hinge         2
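For the programmatically inclined, this simple list is exactly what the data looks like at its core. Here it is as an illustrative Python structure:

```python
# The spectacles BOM from the table above as a simple data structure:
# each line item is a (component, quantity) pair.
spectacles_bom = [
    ("Right Lens", 1),
    ("Left Lens", 1),
    ("Frame", 1),
    ("Hinge", 2),
]

# A trivially useful query: total piece count for one finished product.
total_parts = sum(qty for _, qty in spectacles_bom)
print(f"{total_parts} components per pair of spectacles")  # -> 5 components
```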

It must be said that understanding how a BOM functions is fundamental to understanding how PLM systems work. This simple list is really at the core of the PLM system. However, simple concepts have a tendency to escalate into very complex subjects. And so it is with a BOM.

One of the complexities associated with a BOM is that an organization usually requires several different types of BOM to define a single product. Most manufacturing companies have at least three types:

  1. EBOM (Engineering BOM) is the list of parts that engineers are responsible for; it comprises all the components that require some sort of design input.
  2. MBOM (Manufacturing BOM) is the list of parts required to actually make the product. It typically differs from the EBOM by the components that engineering does not specifically design (glue strips, liquid fills, etc.), and it may also be plant-specific (a sketch of how the three types relate follows this list).
  3. XBOM (Service BOM) is an as-built list of the parts used in a product that actually made it off the factory floor. It may differ from what the MBOM originally specified because of substitutions or problems during manufacture, and it is important from a customer service perspective.
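Continuing the illustrative Python sketch from the spectacles example, the relationship between the three BOM types might look like this (the added items and the substitution are invented examples):

```python
# EBOM: everything engineering designs (from the spectacles example above).
ebom = [("Right Lens", 1), ("Left Lens", 1), ("Frame", 1), ("Hinge", 2)]

# MBOM: the EBOM plus parts needed to build the product that engineering
# doesn't specifically design -- the additions here are invented examples.
mbom = ebom + [("Lens Adhesive", 1), ("Hinge Screw", 4)]

# XBOM (service/as-built): what actually shipped; a factory-floor
# substitution makes it diverge from the MBOM.
xbom = [("Alt. Hinge Screw", 4) if part == ("Hinge Screw", 4) else part
        for part in mbom]

print(set(mbom) - set(xbom))  # -> {('Hinge Screw', 4)}: the substituted item
```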

So the question is: how are your three BOMs authored, edited, maintained, and released? Whatever the answer to this question, the outcome is always the same:

  1. No BOM – No product
  2. Wrong BOM – Factory rework or customer dissatisfaction.

An informal survey of small to medium-sized companies yields surprising results: Excel is the predominant BOM management tool in engineering environments. Manufacturing BOMs are normally handled by some sort of ERP system, and service BOMs are poorly tracked, if at all. This situation is fraught with potential for disaster because of all the manual processes that must occur before an actual product gets made.

Hence the analogy in the title: BOM management may be a hidden bomb, set to explode in an organization as the products being made become more complex. PLM systems can offer a single, organized BOM that represents all the different types in a consistent, controlled manner. Given the potential consequences of the bomb exploding, BOM in PLM should be a priority.

Do you have a BOM management disaster of your own to share? How about a BOM management triumph?

© Tata Technologies 2009-2015. All rights reserved.