Category "PLM Expert Insights"

It’s that time of year in the Siemens PLM world. Siemens NX 11 was released in late August, and new functionality and streamlined tools always get me excited. This year marks the 10th year I’ve been using Siemens products, which I realize is a short time in our industry, but it does make me feel a bit nostalgic about my introduction to them.

10 years ago, I graduated college with my B.S. in Mechanical Engineering and landed my first job as a Project Engineer with a company that produced high-end carbon fiber performance parts. A friend of mine, with whom I had gone to college, was starting there at around the same time, and when I asked him which design software we would be using, he informed me it would be a product called NX. Not being well-versed in all the CAD options on the market at that time, I was not familiar with Siemens NX and worried that I was about to gain experience in a piece of software that wasn’t widely used. As I said, I was not very well aware of the true marketplace!

We started on NX 2, and it would be new software for the company, so, as the young engineers, we were to prove out what it was really capable of. From the very beginning, I took to the software much more quickly than I ever had with the PTC or Works tools I used in school. NX offered not only ease of use but also powerful design tools I had never had access to before. Since we were a manufacturing shop as well, we picked up NX CAM to program our NC mills and lathes to produce the fixtures and tooling used to create our parts. Once again: new software, new capability, but nobody knew it, so it fell to us to learn another part of the software. Eventually, we also procured Femap to run static and dynamic load analyses on our composite layups to ensure part strength and durability (we were creating carbon fiber prosthetic ankles at the time that had to cycle through millions of steps over the course of a month to pass quality requirements). So within a year, I had come to know the CAD, CAM, and CAE sides of the Siemens applications quite well, and I continued to learn and grow with the software during my years there.


Fast forward 10 years, a few jobs, and countless projects and experiences with Siemens products, and I still find myself impressed.  I remember when Synchronous Technology was first released, and the impact it had on the industry.  I remember year after year of functionality improvement, GUI improvement, dialog improvement, system stability and capability improvements.  I remember the advancement of freeform tools, and the “wows” as users and prospective users found ways to do their jobs they had never seen before.  The Siemens product line itself has continued to grow and become more diverse over that time, delving into every aspect of modern product design, from industrial styling to noise and vibration analyses. Siemens’ acquisitions of industry-leading software companies, and the integration of those technologies into their flagship products, have positioned them as a world leader in digital engineering, digital manufacturing, and Product Lifecycle Management.

I feel lucky that I have been able to touch so many different aspects of the software over the last 10 years, and I am always amazed at the improvements that come with each and every release.

Siemens PLM continues its long history of creating the most powerful and flexible design software in the world today. As for NX 11, I covered some of the most exciting new features and functionality in a webinar we hosted just last month. Missed my presentation the first time around? Click here to watch it on demand.

This quote embodies a phenomenon in today’s entertainment media culture: the Alternate Reality Game, or ARG. ARGs aim to bring the audience deeper into the product experience, to rev up the hype. The game’s producers build an elaborate web of interconnected “plot points,” for lack of a better term. These plot points are not overt and are not communicated outright; the audience has to dig for them, picking up clues and solving puzzles that lead from one to the next.

A recent entry into the world of ARGs comes from Blizzard Entertainment’s new game, Overwatch – a game whose players are very interested in information about the next character planned for release. It started with a list of character names on a piece of paper seen in the game – all familiar, with one exception. That exception started the rumors that this could be the next character to be revealed. Next, Blizzard released a couple of videos on YouTube in which a seemingly innocuous blip (below left) actually turned out to be an image; the colors had to be enhanced to reveal a series of numbers (below right). A member of the community converted the hex values to ASCII, XOR’ed them with the number 23, converted the result back to hexadecimal, and mapped it to readable letters…which turned out to be a phrase written in Spanish.

[Image: the original video frame (left) and the color-enhanced frame revealing the series of numbers (right)]
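The community’s decoding step can be sketched in a few lines of Python. The hex values below are invented for illustration; the XOR key of 23 comes from the puzzle described above.

```python
def decode(hex_values, key=23):
    """XOR each hex byte with the key and map the result to readable characters."""
    return "".join(chr(int(h, 16) ^ key) for h in hex_values)

# Encode a sample phrase the same way, just to demonstrate the round trip.
sample = [format(ord(c) ^ 23, "02X") for c in "HOLA"]
print(decode(sample))  # recovers the original text: HOLA
```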

Another clue puzzle was solved when a similar strange anomaly appeared in another video. It was a series of images that looked like the color test patterns you used to see on a TV station late at night after it stopped broadcasting. One of the images was a series of horizontal black and white lines. One player took those lines, turned them 90 degrees, and read them with a barcode reader. The result was, of course, more hex values, which were converted into binary. Rendering that binary as pixels, with 1s black and 0s white, ultimately revealed a QR code. What did the code reveal? “Was that easy? Now that I have your attention, let’s make things more difficult.” As of this writing, the ARG is still alive and well, with more pieces being revealed regularly.
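The bits-to-pixels step can be sketched similarly. The hex string here is invented, and a real QR code is of course much larger:

```python
def hex_to_bits(hex_string):
    """Expand a hex string into a flat list of 0/1 bits."""
    return [int(b) for byte in bytes.fromhex(hex_string) for b in format(byte, "08b")]

def bits_to_rows(bits, width):
    """Group bits into rows of pixels: 1 -> black ('#'), 0 -> white ('.')."""
    glyphs = ["#" if b else "." for b in bits]
    return ["".join(glyphs[i:i + width]) for i in range(0, len(glyphs), width)]

for row in bits_to_rows(hex_to_bits("F0A5"), 8):
    print(row)
```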

Where Does PLM Come In?

09 0C 0B 0E 0B 17 04 12 19 0B 11 06 19 03 12 17 18 03 02 19 18 18 15 04 05

So why is this story in a PLM Insights blog? Well, I’ve seen many companies treat their design and engineering data like an ARG – meaning that lots of people are interested in it, it’s scattered all over the place, and only a few (smart and creative) people really know how to find it. Maybe it’s the “smart” part naming scheme from the late 80s that needs some kind of secret decoder ring, or the folder naming convention where folder names serve as obscure metadata for the files they contain.

An example part file (the names and numbers have been changed to protect the innocent):

S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813\Units Designed By ABC\22648-9399 OP 30 Backup ASM Fixture reworked (same as 22648-9899) See XYZ Folder for new Op80 design\OP80 reworked 4-16-08 same as 22648-9899\Libraries\OP30\Purchase\Stndrd\fd65645_2

Here we have used just about every bit of the Windows character limit (which is 260, for those interested: 3 for the volume designation, 256 for the path and file name with extension, and a null terminating character). Anyone who can manage files this way is clearly a talented individual. Even more impressive, they were part of a 23-person design team that did all of their projects this way. I can’t imagine the frustration of new hires trying to find anything in that kind of environment.
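As a sanity check, the budget arithmetic can be expressed in a couple of lines (a sketch; MAX_PATH applies to the classic Windows APIs, and newer Windows configurations can relax it):

```python
# Classic Windows limit: 260 total = drive designation ("S:\") + up to 256
# for path and file name with extension + 1 terminating null character.
MAX_PATH = 260

def remaining_budget(path):
    """Characters left before a classic Windows API call rejects the path."""
    return MAX_PATH - 1 - len(path)  # reserve 1 for the null terminator

print(remaining_budget(r"S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813"))
```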

The benefits of searching for data are pretty clear (see Google). Yet to this day, companies are still using antiquated methods because they think it’s “cheaper” than implementing a PLM system. Our PLM Analytics benchmark and impact analyses have proven otherwise, and that doesn’t include the myriad other benefits a PLM system offers. Let us know if you’re done playing the engineering and design ARG!

FYI, there is no ARG in this post…or is there?

Among technology practitioners, there is no shortage of pundits offering predictions about the future and where the next big wave is going to hit. The reason is that the stakes are high: a correct forecast of future technology trends can literally be worth billions.

So what are the current predictions talking about? Here is a sampling of the current buzz:

  1. Big Data
  2. Social Media
  3. Crowd Sourcing
  4. Social Computing
  5. Mobile Connectivity

So how does this impact PLM? Traditionally, PLM is conducted on internal infrastructure, in secured environments, using traditional devices. For example, an average designer concerned with the creation of 3D CAD data would be working on a company workstation behind a firewall. Similarly, an engineer creating BOM data would be using a secured client installed on his company laptop. The engineer may take his laptop home and interact with the PLM system via a VPN, but this is probably the extent of “mobility.”

Returning to the technology buzz, consider the potential impact of two trends – mobile connectivity and social computing. Consider the following scenarios:

  1. Your newly recruited engineer has transitioned his digital life to his tablet and no longer uses a laptop. (hence the title of this piece)
  2. The VP of Engineering wants to query the status of his product introduction using his mobile phone.
  3. Your company wants immediate access to customer feedback on existing products so that this can be translated into requirements for new or updated designs.

Given the traditional model sketched out earlier, implementing anything close to these scenarios is almost impossible. The infrastructure, mindset, and processes will not support mobile connectivity from alternative devices, nor allow general access to a requirements-gathering front end. It also raises a whole host of questions around data security, use of private devices, and non-company access. While the technology to achieve these scenarios probably exists, it would require a considerable investment of money and effort to make it happen.

This leads to the fundamental risk-versus-investment equation. It may be possible to construct a business case that justifies the outlay. At a high level, two possibilities exist:

  1. Traditional PLM infrastructure is good enough for at least the next ten years and can be re-evaluated then.
  2. Changing the way business is conducted is a do or die activity and this includes PLM.

An informal survey of small to medium-size companies shows that most participants have not even considered these technology trends – in part because there appears to be no business imperative, and in part because there are other, more attractive avenues for immediate investment.

So, do you want your engineers to be doing all their engineering work on a tablet anywhere in the world?

You have a PLM system. Fundamental to this system is the concept of a version and a revision. However, this is probably the most misunderstood process in the PLM realm. These terms mean different things to different people and are often used interchangeably and inconsistently.

For the purposes of the rest of this piece, we will use the following definitions:

Version – represents a small incremental change in the design that would be saved in the database. Versions are not necessarily saved permanently beyond a revision.

Revision – represents a significant event in the design process and is saved permanently in the database for reference throughout the design process.

Diagrammatically, the difference is illustrated below:

[Diagram: a stream of incremental versions, with selected versions promoted to permanent revisions]

It is often confusing to talk about this subject because the terms are used interchangeably, and the distinction between a version and a revision is not clearly understood – to the extent that some participants think they are the same thing. Because of this, it is important that any organization with a PLM system ensure that all participants clearly understand the definitions and the difference between them.

In a collaborative PLM environment, participants are very dependent on viewing or using data generated by other participants. For example, a headlamp engineer needs the position of locating holes in the sheetmetal to be able to design his locating pins (if this is the order of precedence). In this scenario, the headlamp engineer will say, “I need the latest sheetmetal to begin my design.” This statement is common in design and engineering teams. However, it is inherently imprecise because it raises the question: do you need the latest version or the latest revision?

Based on the definitions given earlier, what is really required is the latest revision. A version is work in progress and could be incomplete, because the responsible author may be in the middle of a redesign or a new concept. For this reason, a version should not be visible to the larger organization; only revisions should be accessible, as they satisfy the definition of “best so far.” This concept is very difficult to get across to a lot of people and represents the conundrum referred to in the title. It takes some courage to work on data that will change sometime in the future, but this is absolutely required in an efficient design process.
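The distinction can be sketched as a toy data model (class and method names are illustrative, not any particular PLM system’s API):

```python
class DesignItem:
    def __init__(self, name):
        self.name = name
        self._versions = []   # work in progress, private to the author
        self._revisions = []  # significant events, permanent and shared

    def save(self, data):
        """Every save creates a version; versions may be purged later."""
        self._versions.append(data)

    def revise(self):
        """Promote the current state to a permanent, visible revision."""
        self._revisions.append(self._versions[-1])

    def latest_revision(self):
        """What collaborators should consume: the 'best so far'."""
        return self._revisions[-1] if self._revisions else None

sheetmetal = DesignItem("sheetmetal")
sheetmetal.save("hole at (10, 20)")
sheetmetal.revise()
sheetmetal.save("hole moved to (12, 20)")  # in progress, not yet visible
print(sheetmetal.latest_revision())        # still "hole at (10, 20)"
```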

The version/revision conundrum also leads to some interesting human psychology. Consider any collaborative design environment where multiple participants have to submit data into a PLM system to progress a large project. It is important in these environments that all participants follow the mantra of “publish early, publish often” or, per the nomenclature of this piece, create many revisions. This is based on the principle that incomplete or slightly inaccurate data is better than no data at all.

However, process managers often put in place systems that highlight inaccuracies or incomplete data, effectively punishing early publishers. So data authors hold back and only create revisions late in the process, when they are certain of accuracy. This is counterproductive.

So pay attention to the version/revision conundrum; clear definitions and policies of this simple issue can greatly improve a PLM process.

 

A Bill of Material (BOM) at its core is a very simple concept: a list of components needed to manufacture a finished product. So if one was making a pair of spectacles, the BOM may look as follows:

Finished Product   Spectacles   Quantity
Item 1             Right Lens   1
Item 2             Left Lens    1
Item 3             Frame        1
Item 4             Hinge        2
It must be said that understanding how a BOM functions is fundamental to understanding how PLM systems work. This simple list is really at the core of the PLM system. However, simple concepts have a tendency to escalate into very complex subjects. And so it is with a BOM.

One of the complexities associated with a BOM is that an organization usually has a requirement for different types of a BOM in order to define a single product. Most manufacturing companies have at least three types:

  1. EBOM (Engineering BOM) is the list of parts that engineers are responsible for and comprises all the components that require some sort of design input.
  2. MBOM (Manufacturing BOM) is the list of parts required to actually make the product. It typically differs from the EBOM by the components that engineering does not specifically design (glue strips, liquid fills, etc.), and it may also be plant-specific.
  3. XBOM (Service BOM) is an as-built list of the parts used in a product that actually made it off the factory floor. It may differ from what the MBOM originally specified because of issues during manufacture, and it is important from a customer-service perspective.
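The EBOM-to-MBOM relationship can be sketched minimally (part names are invented for illustration):

```python
# Parts engineering designs and is responsible for.
ebom = [("Right Lens", 1), ("Left Lens", 1), ("Frame", 1), ("Hinge", 2)]

# Components needed to make the product that engineering does not design.
plant_additions = [("Glue Strip", 2), ("Lens Cleaning Fill", 1)]

# The MBOM is the EBOM plus the non-engineered items; it may also vary by plant.
mbom = ebom + plant_additions
print(len(ebom), len(mbom))  # the MBOM carries more line items than the EBOM
```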

So the question is: how are your three BOMs authored, edited, maintained, and released? Whatever the answer to this question, the outcome is always the same:

  1. No BOM – No product
  2. Wrong BOM – Factory rework or customer dissatisfaction.

An informal survey of small to medium size companies yields surprising results: Excel is the predominant BOM management tool in an engineering environment. Manufacturing BOMs are normally handled by some sort of ERP system and service BOMs are poorly tracked, if at all. This situation is fraught with potential for disaster because of all the manual processes that have to occur before an actual product gets made.

Hence the analogy in the title. BOM management may be a hidden problem that is set to explode in an organization, especially as the products being made become more complex. PLM systems can offer a single organized BOM that represents all the different types in a consistent, controlled manner. Given the potential consequences of the bomb exploding, BOM in PLM should be a priority.

Do you have a BOM management disaster of your own to share? How about a BOM management triumph?

Fundamental to any PLM system is the idea of Access Control and data security. Only authorized personnel can access a PLM system and view or manipulate its contents. This is controlled via a login procedure that includes a user password. Personnel are added to the list of authorized users by the PLM administrator after their specific access rights have been approved.

Once access has been granted to users, it must then be determined what operations they can carry out in the PLM system. The simplest (and default) security model, which allows all users to carry out any operation, is very undesirable and could lead to actions that destroy or leak vital data.

This scenario requires the development of a security model that determines which users can carry out which operations. Security models are normally based on two concepts:

  1. Roles
  2. Organizations

A role in the database would define what the user who is assigned that role is allowed to do. Typical roles are as follows:

  1. Viewer – this role would be allowed to view data but not make any alterations or modifications
  2. Team Member – this role would be allowed to alter and update a limited subset of the data along with being able to carry out certain operations (e.g. initiate a workflow)
  3. Team Leader – this role would be able to do everything that a Team Member could do along with the ability to operate on a larger subset of data and carry out more operations (e.g. progress a workflow, change ownership)
  4. Approver – this role would be able to approve certain operations on the data (e.g. approve a release of information)
  5. Database Admin – normally limited to a handful of technically qualified people

Once roles in a database have been defined, the organizations are put in place. These normally mirror actual organizational structure, although this is not a necessity. Organizations in a PLM system usually work on specific projects or programs. Once the organization is defined, users are allocated to various organizations and are assigned specific roles.

The final result can be represented in a table as follows:

                             Within Organization        Outside Organization
User          Role           View   Modify   Approve    View   Modify
John Doe      Team Leader    Y      Y        N          Y      N
Paul Revere   Team Member    Y      Y        N          N      N
David Earp    Approver       Y      N        Y          Y      N
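The table above can be expressed as data with a simple check function (a sketch; real PLM security models are considerably richer):

```python
PERMISSIONS = {
    # role: operations allowed within and outside the user's organization
    "Team Leader": {"within": {"view", "modify"}, "outside": {"view"}},
    "Team Member": {"within": {"view", "modify"}, "outside": set()},
    "Approver":    {"within": {"view", "approve"}, "outside": {"view"}},
}

def allowed(role, scope, operation):
    """Return True if the role may perform the operation in that scope."""
    return operation in PERMISSIONS.get(role, {}).get(scope, set())

print(allowed("Team Member", "outside", "view"))  # no access outside the org
print(allowed("Approver", "within", "approve"))   # approvers can approve
```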

So how is security set up in your PLM system? Are all the security capabilities being used to ensure that no intellectual property is destroyed or leaked?

Back in the day…

There it was, one of the first internet communities, Usenet, about to undergo a sea-change unlike any it had seen before. It was 1993, September, a month that would never end.

It started much as the years before had: an influx of new people coming into the universities, getting online for the first time. The community absorbed them in much the same manner as it had in the past. These first-timers were indoctrinated in the well-established etiquette and protocols required to thrive in this brave new world.

It seems archaic now, but back then, in the “before times”, there was no way for mass discussion; social media had not yet been born.

The plot twist

And then it happened. AOL, then a name synonymous with the internet, decided to grant access to Usenet for all of its customers. Picture the mobs that gather outside department stores the morning after Thanksgiving: the unlocking of the door let loose a mass of people that overwhelmed the community. There were just not enough graceful souls able to help coach these new users in “civilized” net behavior. Social norms were thrashed; standards went out the window. It was the equivalent of the wild, wild west. In a word, it was chaos.

Future looking

Now think of how you on-board new designers or engineers. You show them who’s helpful and who to avoid. You show them around, pointing out places of interest, and teach them company standards, design methodologies, workflow processes, and so on. Over the coming decade (2014 through 2024, to be exact), according to stats provided by the Bureau of Labor Statistics (BLS), the Architecture and Engineering field will grow an average of 3.4%, or about 710,000 jobs.

The biggest (projected) job gainers:

  • Civil – 106,700
  • Mechanical – 102,500
  • Industrial – 72,800
  • Electrical – 41,100

Couple this with the BLS projection of labor force participation over the same period, where we’ll see a 1:1.3 ratio of people leaving the workforce to people entering it. That is a lot of churn, meaning a lot of people to on-board. The products will be ever more complicated, and the enabling technology will be as well. Technology is cited as one of the reasons the field isn’t growing as fast as other areas: the productivity gains from PLM are making companies more efficient, even as complexity grows.

Conclusion

Companies will need a strategy for managing changes in their employee base as well as the technology evolution. We offer a series of benchmarking and analysis services called PLM Analytics, and there is one specifically aimed at this issue called PLM Support. Let us know if we can help solve your Eternal September.

So your company has embarked on the PLM journey. Strategy is agreed, budget is approved, the preliminary plan for execution is in place and Return on Investment has been computed.

The next step in the process is choosing a software suite and an associated vendor. Unfortunately, the nature of the software is such that one cannot mix and match programs or modules to suit specific requirements; the major vendors design their solutions in such a way that organizations are locked into a specific monolithic offering. The choice of vendor therefore has long-term ramifications and, on the face of it, appears to be a momentous decision.

So, how does one choose a PLM technology vendor? For the purposes of answering, let us submit two potential techniques:

  1. Undertake a comprehensive study to evaluate the merits of each vendor’s solution against requirements, conduct benchmarks, and produce recommendations (the bake-off)
  2. Meet in the main boardroom of the company, ensure attendance of auditors and all interested parties, and toss a coin to decide which vendor to choose (the coin toss)

Before debating the merits and demerits of each technique, it is instructive to outline a methodology for the bake-off option. The high-level steps required to conduct a study are as follows:

  1. Outline business imperatives and goals (e.g. global engineering)
  2. Identify the PLM processes that have to be put in place or facilitated to meet these goals (e.g. extended design reviews)
  3. Create use cases to illustrate the processes (e.g. the ability to load a complete product into a WebEx session and have geographically dispersed teams critique it)
  4. Evaluate each technology against the use case and score its capability to support the use case (e.g. how long does it take to load a product into a review session)
  5. Total up all the scores and make a recommendation.
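Steps 4 and 5 can be sketched as a simple scoring tally (vendor names and scores are invented for illustration):

```python
use_cases = ["load product into review session", "extended design review"]

# One score per use case, in the order listed above.
scores = {
    "Vendor A": [7, 8],
    "Vendor B": [9, 5],
}

totals = {vendor: sum(s) for vendor, s in scores.items()}
recommendation = max(totals, key=totals.get)
print(totals, "->", recommendation)
```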

Clearly the bake-off would be the conventional business approach. It offers the advantages of rigor, objectivity, and a comprehensive approach. By following the evaluation methodology, an organization is guaranteed to have a technology that supports its needs.

So why even consider the coin toss? Any business person worth his salt would recoil at the thought of employing such a sloppy and unscientific method. But before dismissing it out of hand, consider a few items. Firstly, technology changes at an alarming pace, and what is good today in one vendor’s solution will be outpaced by next year’s release. Secondly, the tough part of PLM implementations is managing organizational change, and this has nothing to do with technology. Thirdly, agonizing over decisions is probably worse than making a snap decision – fortune favors the bold.

So consider the coin toss, or at least a compressed bake-off; it can certainly save time and maybe allow an organization to leap ahead of its competition.

Need help making your plan? Our PLM Analytics Benchmark process will give you the tools you need for your bake-off. Click here to request a session.

So you’re an executive at a manufacturing company. You make things that are useful to your customers and you return profits to ever-demanding shareholders. You have probably heard of PLM before; perhaps your staff have mentioned the acronym. But how badly do you need it?

Here are 10 indicators that you definitely need PLM:

  1. Your engineering organization is often late meeting customer deadlines. This results from poorly executed projects, inefficient processes and lack of clear deliverables. All of these problems can be addressed by a PLM system supporting the engineering organization.
  2. Warranty costs are creeping up. One of the largest contributors to poor product quality is sloppy design and incomplete engineering definition. Installing appropriate PLM technology to support design activities results in a better specification being communicated to manufacturing.
  3. Factory scrap rates are above industry standards. For example, scrap and rework is often traced back to a wrong drawing, an incorrect dimension or a poorly specified component. Complete and accurate product design is supported by a robust PLM system.
  4. R&D costs as a percentage of revenue are excessive. Engineering and design activity is bloated with too much headcount and overhead. Yet they are late with deliverables. PLM means efficiency in R&D.
  5. The organization struggles with coordination. It appears as if manufacturing and engineering are always at odds with both departments blaming one another for mistakes. PLM can offer objective data to resolve these issues.
  6. There is no accountability in the organization. It is difficult to diagnose where mistakes were made and who is responsible. People are always blaming other departments. A PLM system can provide objective data that allows the root cause to be addressed.
  7. Expedited freight costs are bleeding away your profits. Excessive expedited freight costs are common in companies that are late with deliveries and have to ship under duress to avoid customer penalties. Better upstream engineering supported by PLM can improve this problem.
  8. Your competitors always beat you to market with new products. Is innovation management and new product introduction a problem for your organization? A better PLM system can make dramatic differences in this area.
  9. Customers complain that they do not get the information they need. You owe your customers information at various stages during the engagement cycle and they never get it in a timely manner. A suitably configured PLM system can improve this dramatically.
  10. Your suppliers provide the wrong information. This can be a common problem diagnosed by your engineering staff. But do your suppliers have the right request to begin with? PLM technology can bridge this gap.

Do you have three or more of these issues keeping you up at night? Time to take a serious look at a PLM system.

Any organization managing product introduction must have an underlying project plan determining how this is going to happen. Any design process goes through various stages from initial concept to final product. A generic process is illustrated below:

[Diagram: a generic design process, from initial concept through final product]

Now, this may be simple at a concept level, but can become incredibly complex at an execution level. Consider if you had a 500-person team working on a project and all of their activities needed to be coordinated. Even more crucial to the overall success is the management of deliverables – has every participant delivered his or her contribution to the project on time and to the required quality?

This is where an integrated PLM and project management system can be a powerful tool. In such a system, engineering and design deliverables are attached to tasks in a project plan, and the associated task can only be considered complete once this has occurred. If project management is executed using a standalone system, there is no link between task and deliverable; no way of knowing for certain if what is reflected in the project plan corresponds with reality.

So as a project manager, which would you prefer?

  1. A project plan that is disconnected from the required deliverables and may or may not reflect reality.
  2. An integrated system where the project plan is tied to engineering and design deliverables.

There are several consequences of such an integrated system:

  1. Project managers can immediately see if deliverables have been fulfilled. There is no requirement to verbally or formally query a project participant.
  2. Milestone reviews can be conducted efficiently; either the milestone deliverable is in the system or it is not.
  3. Project managers are presented with real time status reports and dashboards. As a deliverable is attached to a task, the report is updated.
  4. There is no hiding the dates for completing a task. The timestamp of when a deliverable is completed is visible for all to see.

All of this allows for what the PLM world calls “automatic” or “invisible” project governance. Projects are self-governing, with all participants being aware of status in real time.
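This kind of self-governance can be sketched as a toy model in which a task completes only when its deliverable is attached (class and file names are illustrative):

```python
from datetime import datetime

class Task:
    def __init__(self, name):
        self.name = name
        self.deliverable = None
        self.completed_at = None  # timestamp visible to all; no hiding dates

    def attach_deliverable(self, deliverable):
        """Attaching the deliverable is what completes the task."""
        self.deliverable = deliverable
        self.completed_at = datetime.now()

    @property
    def complete(self):
        return self.deliverable is not None

plan = [Task("OP30 fixture design"), Task("OP80 rework")]
plan[0].attach_deliverable("fixture_asm.prt")

# A milestone review becomes a query, not a meeting:
print([t.name for t in plan if not t.complete])
```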

Wouldn’t you want this kind of system?

© Tata Technologies 2009-2015. All rights reserved.