Category "Product Lifecycle Management"

There they were, sailing along their merry way. On the horizon, a narrow strait appears. As the boat gets closer, they notice a couple of strange characteristics: on one side a cliff, on the other a whirlpool. Upon arrival, it becomes apparent that this is the cliff where the monster Scylla dwells. On the other side lurks the monster Charybdis, spewing out huge amounts of water and causing deadly whirlpools. Each monster is close enough that to avoid one means meeting the other. Determined to get through, our intrepid hero Ulysses must make a decision. The idiom “between Scylla and Charybdis” comes from this story; in more modern terms, we would translate it as “the lesser of two evils.”

PLM administrators, engineering managers, and IT teams are often given this same choice, with equally deadly – well, unfortunate – outcomes. What is the dilemma? Customize the PLM system (beyond mere configuration) to match company policies and processes, or change the culture to bend to the limitations posed by “out of the box” configurations.

Companies will often say something to the effect of “We need the system to do X.” To which many vendors meekly reply, “Well, it can’t exactly do X, but it’s close.” So what is a decision-maker to do? Trust that the organization can adapt, risking lost productivity and possibly mutiny? Or respond by asking “What will it take to get it to do X?” and incur the risk of additional cost and implementation time?

We can further elaborate on the risks of each. When initially developing the customizations, there is the risk of what I call “vision mismatch”: the customer describes X as best they can, with a full understanding of a bigger picture that gets lost when the developer writes up the specification. This leads to multiple revisions of the code and frustration on both sides of the table. Customizations also carry the longer-term risk of “locking” into a specific version: while you gain the benefit of keeping your processes perfectly intact, the system is stuck in time unless the customizations are upgraded in parallel. Some companies avoid that by never upgrading…until their hardware, operating systems, or underlying software systems become unsupported and obsolete. Then the whole thing can come to a crashing halt. Hope the backups work!

However, not customizing has its own risks. What if the new PLM system is replacing an older “homegrown” system that automated some processes the new system cannot? (A “homegrown” system comes with its own set of risks: the original coder leaves the company, the code was never commented, there are no specifications, etc.) For example, suppose raising an issue automatically created an engineering change request while starting a CAPA process; without that automation, the company has gained a manual process, exposing it to human error. Or perhaps the company has a policy requiring that change orders go through a “four-eyes” approval process, a use case the new system has no mechanism to support.

Customizing is akin to Charybdis, whom Ulysses avoided, deciding that it was better to knowingly lose a few crew members than to risk losing the entire ship to the whirlpool. Not customizing is more like Scylla: the losses are smaller, but their probability is much higher, to the point of near certainty.

We’ve been through these straits and lived. We’ve made the passage with many companies, from large multinationals to the proverbial “ma and pa” shops. Let us help you navigate the dangers with our PLM Analytics benchmark.

When we talk with customers that may need to enhance their PLM technology or methods, there are commonly two schools of thought on the subject. Companies generally start the conversation with one of two focuses: either CAD-focused or process-focused.

CAD-centric companies are the ones that rely heavily on design and engineering work to support their business. They generate a lot of CAD data, and eventually this CAD data becomes a real pain to manage effectively with manual processes. Things get lost, data is hard to locate, design reuse is only marginally successful, and the release process carries a lower level of confidence. These companies usually start thinking about PLM because they need to get their CAD data under control, and they usually start with a minimal approach that is just sufficient to tackle the obvious problem of CAD data management. Sometimes other areas of PLM are discussed, but they are “planned” for a later phase, which inevitably turns into a “much later” phase that still hasn’t happened. What they have done is grease the squeaky wheel while ignoring the corroding frame that is potentially a much bigger problem. CAD-centric companies often benefit from taking a step back to look at their processes; many times they will find that is where the biggest problems lie.

BOMs are often associated with CAD geometry, but many times this isn’t the case.

Companies that don’t deal with a lot of CAD data can often realize the benefits of PLM from a process improvement perspective. Product introductions, project management, BOM management, customer requirements, change management, and quality management are just some areas that PLM can help improve. Many process-focused companies already have systems in place to address these topics, but they are often not optimized and usually not connected. They tend to be individual silos of work or information, which slows the overall “get to market” process and reduces the overall effectiveness of the business. These companies might not have the obvious “squeaky wheel” of CAD to manage, but they have PLM challenges just the same. The key to improvement is identifying those challenges and actually doing something about them.

In either case, Tata Technologies has the people and processes to help identify and quantify your company’s biggest challenges through our PLM Analytics process.  This process was developed specifically to address the challenges companies have in identifying and quantifying areas for PLM improvement.  If you’re interested in better identifying areas of improvement for your company’s PLM process, just let us know.  We’re here to help.


Sometimes CAD can be used to start establishing PLM practices. Since PLM systems rely on data to be effective, ensuring consistent and correctly entered information is paramount. Classification, properties, and metadata can rely heavily on CAD to be used effectively. For example, let’s consider the classification and data for a machined part. If the part is going to require machining, we could assign it a classification of “Machined.” Since the part is going to be machined, we would want to ensure that “Stock Size” is one piece of metadata to be tracked. Most CAD systems have a way to ensure this “Stock Size” is at least filled out, and some could even be automated to calculate the stock size without any user intervention. Of course, repeatable logic would need to be put in place, but once that is done, the time spent on stock size calculations, and the potential for errors, would be eliminated.


Case in point: Utilize iLogic in Autodesk Inventor to calculate stock size for machined parts. Once this is done, users can forget about manually checking all the measurements; all they need to do is flag the part as “Machined” and the system does the rest!
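The underlying rule can be sketched in Python (iLogic itself uses a VB-style syntax inside Inventor; the machining allowance and stock increment below are assumed values for illustration, not anything prescribed by Inventor):

```python
import math

# Sketch of a repeatable stock-size rule for a part classified as
# "Machined": round each bounding-box dimension up to the next stock
# increment after adding a machining allowance.
# allowance_mm and increment_mm are illustrative assumptions.
def stock_size(bbox_mm, allowance_mm=3.0, increment_mm=5.0):
    """bbox_mm: (length, width, height) of the part's bounding box."""
    def round_up(value):
        return math.ceil((value + allowance_mm) / increment_mm) * increment_mm
    return tuple(round_up(v) for v in bbox_mm)

# A 47.2 x 21.5 x 9.8 mm part needs 55 x 25 x 15 mm stock under this rule.
print(stock_size((47.2, 21.5, 9.8)))
```

Once a rule like this runs automatically on every part flagged “Machined,” the stock size property is always present and always calculated the same way, which is exactly the kind of consistent metadata a PLM system needs.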


It’s that time of year in the Siemens PLM world. Siemens NX 11 was released in late August, and new functionality and streamlined tools always get me excited. This year marks the 10th year I’ve been using Siemens products, which I realize is a short time in our industry, but it does make me feel a bit nostalgic about my introduction to them.

10 years ago, I graduated college with my B.S. in Mechanical Engineering, and gained my first job as a Project Engineer with a company that produced high-end carbon fiber performance parts.  A friend of mine, with whom I had gone to college, was also starting there at around the same time, and when I asked him which design software we would be using, he informed me it would be a product called NX. At that time, not being overly well-versed on all the options for CAD in the marketplace, I was not familiar with Siemens NX and worried that I was about to become experienced in a piece of software that wasn’t widely used.  As I said, I was not very well aware of the true marketplace!

We started on NX 2, and since it would be new software for the company, as the young engineers we were to prove out what it was really capable of. From the very beginning, I took to the software much more quickly than I ever had when using PTC or Works while in school. NX offered not only ease of use but powerful design tools I had never had access to before. Since we were a manufacturing shop as well, we picked up NX CAM to program our NC mills and lathes to produce fixtures and tooling used to create our parts. Once again: new software, new capability, but nobody knew it, so it fell to us to learn another part of the software. Eventually, we also procured Femap to do static and dynamic load analyses on our composite layups to ensure part strength and durability (we were creating carbon fiber prosthetic ankles at that time that had to cycle through millions of steps over the course of a month to pass quality requirements). So within a year, I had come to know the CAD, CAM, and CAE side of Siemens applications quite well, and I continued to learn and grow with the software during my years there.


Fast forward 10 years, a few jobs, and countless projects and experiences with Siemens products, and I still find myself impressed. I remember when Synchronous Technology was first released, and the impact it had on the industry. I remember year after year of functionality improvement, GUI improvement, dialog improvement, system stability and capability improvements. I remember the advancement of freeform tools, and the “wows” as users and prospective users discovered ways of doing their jobs that they had never seen before. The Siemens product line itself has continued to grow and become more diverse over that time, delving into every aspect of modern product design, from industrial styling to noise and vibration analyses. Siemens’ acquisitions of industry-leading software companies, and the integration of those technologies into their flagship products, have positioned them as a world leader in digital engineering, digital manufacturing, and Product Lifecycle Management.

I feel lucky that I have been able to touch so many different aspects of the software over the last 10 years, and I am always amazed at the improvements that come with each and every release.

Siemens PLM continues their long history of creating the most powerful and flexible design software in the world today. And as for NX 11, I covered some of the most exciting new features and functionalities in a webinar we hosted just last month. Missed my presentation the first time around? Click here to watch it on demand.

This quote embodies a phenomenon in today’s entertainment media culture: the Alternate Reality Game, or ARG. ARGs aim to bring the audience deeper into the product experience, to rev up the hype. The game’s producers build an elaborate web of interconnected “plot points,” for lack of a better term. These plot points are not overt and are never communicated outright; the audience has to dig for them by picking up clues and solving puzzles, each of which leads to the next, and so on.

A recent entry into the world of ARGs is Blizzard Entertainment’s new game, Overwatch – a game whose players are very interested in information about the next character planned for release. It started with a list of character names on a piece of paper seen in the game, all familiar with one exception. That started the rumors that this could be the next character to be revealed. Next, Blizzard released a couple of videos on YouTube in which a seemingly innocuous blip (below left) actually turned out to be an image; the colors had to be enhanced to reveal a series of numbers (below right). A member of the community XOR’ed the hex values with the number 23 and mapped the results to readable letters…which turned out to be a phrase written in Spanish.
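The decoding step can be sketched in a few lines of Python. The hex payload below is a made-up example (the word “Hola” XOR’ed with 23), not the actual bytes from Blizzard’s video:

```python
# Decode a hex string by XOR-ing each byte with a single-byte key.
# XOR is its own inverse, so the same function both encodes and decodes.
def xor_decode(hex_string: str, key: int) -> str:
    data = bytes.fromhex(hex_string)
    return bytes(b ^ key for b in data).decode("ascii")

# "5f787b76" is "Hola" with every byte XOR'ed against 23 (0x17).
print(xor_decode("5f787b76", 23))  # → Hola
```

Community puzzle-solvers typically brute-force the key by trying all 256 values and keeping whichever output is printable text.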

[Images: the anomaly from the video (left) and its color-enhanced numbers (right)]

Another clue puzzle was solved when a similar strange anomaly appeared in another video. It was a series of images that looked like the color patterns you used to see on a TV station late at night after it stopped broadcasting. One of the other images was a series of horizontal black and white lines. One player took those lines, turned them 90 degrees, and read them with a bar code reader. The result was, of course, more hex values, which were converted into binary. Mapping the binary digits to pixels, 1s black and 0s white, ultimately revealed a QR code. What did the code reveal? “Was that easy? Now that I have your attention, let’s make things more difficult.” As of the writing of this blog, the ARG is still alive and well, with more pieces being revealed regularly.

Where Does PLM Come In?

09 0C 0B 0E 0B 17 04 12 19 0B 11 06 19 03 12 17 18 03 02 19 18 18 15 04 05

So why is this story in a PLM Insights blog? Well, I’ve seen many companies treat their design and engineering data like an ARG – meaning that lots of people are interested in it, it’s all over the place, and only a few (smart and creative) people really know how to find it. Whether it’s the “smart” part naming scheme from the late 80s that needs some kind of secret decoder ring, or the folder naming conventions whose names serve as some sort of obscure metadata for the files contained in them.

An example part file (the names and numbers have been changed to protect the innocent):

S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813\Units Designed By ABC\22648-9399 OP 30 Backup ASM Fixture reworked (same as 22648-9899) See XYZ Folder for new Op80 design\OP80 reworked 4-16-08 same as 22648-9899\Libraries\OP30\Purchase\Stndrd\fd65645_2

Here we have used just about every bit of the Windows character limit (which is 260, for those interested: 3 for the volume designation, 256 for the path and file name with extension, and a null terminating character). Anyone who can manage files this way is clearly a talented individual. Even more impressive is that they were part of a 23-person design team that did all of their projects this way. I can’t imagine the frustration of their new hires trying to find anything in this kind of environment.
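A quick way to audit a file share for paths flirting with that limit is to walk the tree and measure each full path. A minimal sketch in Python, using the classic Windows MAX_PATH breakdown described above:

```python
import os

# Legacy Windows limit: 3 (drive, e.g. "S:\") + 256 (path + file name
# with extension) + 1 (null terminator) = 260 characters total.
MAX_PATH = 260

def flag_long_paths(root: str, limit: int = MAX_PATH - 1) -> list:
    """Return full file paths under `root` longer than the usable limit."""
    too_long = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if len(full) > limit:
                too_long.append(full)
    return too_long
```

Running a script like this before a PLM migration gives an early inventory of the paths most likely to break during bulk import.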

The benefits of searching for data are pretty clear (see Google). Yet to this day, companies are still using antiquated methods because they think it’s “cheaper” than implementing a PLM system. Our PLM Analytics benchmark and impact analyses have proven otherwise, and that doesn’t include the myriad other benefits a PLM system offers. Let us know if you’re done playing the engineering and design ARG!

FYI, there is no ARG in this post…or is there?

Among technology practitioners, there is no shortage of pundits offering predictions into the future and where the next big wave is going to hit. The reason for this is that the stakes are high – a correct forecast of future technology trends can literally be worth billions.

So what are the current predictions talking about? Here is a sampling of the current buzz:

  1. Big Data
  2. Social Media
  3. Crowd Sourcing
  4. Social Computing
  5. Mobile Connectivity

So how does this impact PLM? Traditionally, PLM is conducted on internal infrastructure, in secured environments, using traditional devices. For example, an average designer concerned with the creation of 3D CAD data would be working on a company workstation behind a firewall. Similarly, an engineer creating BOM data would be using a secured client installed on his company laptop. The possibility exists that the engineer may take his laptop home and interact with the PLM system via a VPN, but this is probably the extent of “mobility.”

Returning to the technology buzz, consider the potential impact of two trends – mobile connectivity and social computing. Consider the following scenarios:

  1. Your newly recruited engineer has transitioned his digital life to his tablet and no longer uses a laptop. (hence the title of this piece)
  2. The VP of Engineering wants to query the status of his product introduction using his mobile phone.
  3. Your company wants immediate access to customer feedback on existing products so that this can be translated into requirements for new or updated designs.

Given the traditional model sketched out earlier, implementing anything close to these scenarios is almost impossible. The infrastructure, mindset, and processes will not support mobile connectivity from alternative devices, nor allow general access to a requirements-gathering front end. It also raises a whole lot of questions around data security, use of private devices, and non-company access. While the technology to achieve these scenarios probably exists, it would require a considerable investment of money and effort to make it happen.

This leads to the fundamental risk/investment equation. It may be possible to construct a business case that justifies the outlay. At a high level, two possibilities exist:

  1. Traditional PLM infrastructure is good enough for at least the next ten years and can be re-evaluated then.
  2. Changing the way business is conducted is a do or die activity and this includes PLM.

An informal survey of small to medium-sized companies shows that most participants have not even considered these technology trends, in part because there appears to be no business imperative, and in part because there are other, more attractive avenues for immediate investment.

So, do you want your engineers to be doing all their engineering work on a tablet anywhere in the world?

You have a PLM system. Fundamental to this system are the concepts of a version and a revision. However, this is probably the most misunderstood process in the PLM realm, and these terms mean a wide variety of things to different people, often being used interchangeably and without consistency.

For the purposes of the rest of this piece, we will use the following definitions:

Version – represents a small incremental change in the design that would be saved in the database. Versions are not necessarily saved permanently beyond a revision.

Revision – represents a significant event in the design process and is saved permanently in the database for reference throughout the design process.

Diagrammatically, the difference is illustrated below:

[Diagram: incremental versions vs. permanent revisions]

It is often confusing to talk about this subject because the terms are used interchangeably. The distinction between a version and a revision is not clearly understood, even to the extent that participants think they are the same thing. Because of this, it is important that any organization with a PLM system ensure that all participants clearly understand the definitions and the difference between them.

In a collaborative PLM environment, participants are very dependent on viewing or using data generated by other participants. For example, a headlamp engineer needs the position of locating holes in the sheetmetal to be able to design his locating pins (if this is the order of precedence). In this scenario, the headlamp engineer will say, “I need the latest sheetmetal to begin my design.” This statement is common in design and engineering teams. However, it is inherently imprecise because it raises the question: do you need the latest version or the latest revision?

Based on the definition given earlier, what is really required is the latest revision. A version is a work in progress and could be incomplete or half-done because the responsible author may be in the middle of a redesign or new concept. For this reason, a version should not be visible to the larger organization; only revisions should be accessible, as they satisfy the definition of “best so far.” This concept is very difficult to get across to a lot of people and represents the conundrum referred to in the title. It takes some courage to work on data that will change sometime in the future, but this is absolutely required in an efficient design process.
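A minimal sketch of this distinction, using hypothetical data structures: every save is a version, a save becomes a revision only when published, and the wider organization sees only the latest revision while the author’s latest version stays private:

```python
from dataclasses import dataclass
from typing import Optional

# A save of the design. `revision` is set (e.g. "A", "B") only when the
# author publishes this save as a significant, permanent milestone.
@dataclass
class Save:
    version: int
    revision: Optional[str] = None

def latest_revision(saves):
    """The newest published save: the 'best so far' that the wider
    organization should consume."""
    published = [s for s in saves if s.revision is not None]
    return max(published, key=lambda s: s.version, default=None)

def latest_version(saves):
    """The author's most recent save, possibly half-done; it should
    stay private to the author."""
    return max(saves, key=lambda s: s.version)

sheetmetal = [Save(1), Save(2, revision="A"), Save(3), Save(4)]
print(latest_revision(sheetmetal).revision)  # what the headlamp engineer gets: A
print(latest_version(sheetmetal).version)    # the author's work in progress: 4
```

In this model the headlamp engineer’s request for “the latest sheetmetal” resolves to revision A, even though the sheetmetal designer has saved two newer, unpublished versions.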

The version/revision conundrum also leads to some interesting human psychology. Consider any collaborative design environment where multiple participants have to submit data into a PLM system to progress a large project. It is important in these environments that all participants follow the mantra of “publish early, publish often” or, per the nomenclature of this piece, create many revisions. This is based on the principle that incomplete or slightly inaccurate data is better than no data at all.

However, process managers often put systems in place that highlight inaccuracies or incomplete data, effectively punishing early publishers. So data authors hold back and only create revisions late in the process, once they are certain of accuracy. This is counterproductive.

So pay attention to the version/revision conundrum; clear definitions and policies of this simple issue can greatly improve a PLM process.


Fundamental to any PLM system are the ideas of access control and data security. Only authorized personnel can access a PLM system and view or manipulate its contents. This is controlled via a login procedure that includes a user password. Personnel are added to the list of authorized users by the PLM administrator after someone has approved their specific access rights.

Once access has been granted, it must be determined what operations each user can carry out in the PLM system. The simplest (and default) security model, which allows all users to carry out any operation, is very undesirable and could lead to actions that destroy or leak vital data.

This scenario requires the development of a security model that determines which users can carry out which operations. Security models are normally based on two concepts:

  1. Roles
  2. Organizations

A role in the database defines what a user assigned that role is allowed to do. Typical roles are as follows:

  1. Viewer – this role would be allowed to view data but not make any alterations or modifications
  2. Team Member – this role would be allowed to alter and update a limited subset of the data along with being able to carry out certain operations (e.g. initiate a workflow)
  3. Team Leader – this role would be able to do everything that a Team Member could do along with the ability to operate on a larger subset of data and carry out more operations (e.g. progress a workflow, change ownership)
  4. Approver – this role would be able to approve certain operations on the data (e.g. approve a release of information)
  5. Database Admin – normally limited to a handful of technically qualified people

Once roles in a database have been defined, the organizations are put in place. These normally mirror actual organizational structure, although this is not a necessity. Organizations in a PLM system usually work on specific projects or programs. Once the organization is defined, users are allocated to various organizations and are assigned specific roles.

The final result can be represented in a table as follows:

                           Within Organization     Outside Organization
User         Role          View  Modify  Approve   View  Modify
John Doe     Team Leader    Y      Y       N        Y      N
Paul Revere  Team Member    Y      Y       N        N      N
David Earp   Approver       Y      N       Y        Y      N
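The table above can be sketched as a simple permission lookup keyed on role and on whether the data belongs to the user’s own organization. The Viewer row is an assumption based on the role descriptions, since it does not appear in the table:

```python
# Hypothetical sketch of the role/organization security model:
# each role maps to (operations allowed within the user's organization,
#                    operations allowed on other organizations' data).
PERMISSIONS = {
    "Viewer":      ({"view"},           {"view"}),  # assumed, not in the table
    "Team Member": ({"view", "modify"}, set()),
    "Team Leader": ({"view", "modify"}, {"view"}),
    "Approver":    ({"view", "approve"}, {"view"}),
}

def can(role: str, operation: str, same_org: bool) -> bool:
    within, outside = PERMISSIONS[role]
    return operation in (within if same_org else outside)

print(can("Team Leader", "modify", same_org=True))   # → True
print(can("Team Leader", "modify", same_org=False))  # → False
```

Keeping the model in one table like this makes it easy to audit: every role/operation/organization combination is answered by a single lookup rather than by scattered checks.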

So how is security set up in your PLM system? Are all the security capabilities being used to ensure that no intellectual property is destroyed or leaked?

Back in the day…

There it was, one of the first internet communities, Usenet, about to undergo a sea-change unlike any it had seen before. It was 1993, September, a month that would never end.

It started much as the years before had: an influx of new people coming into the universities, getting online for the first time. The community absorbed them in much the same manner as it had in the past. These first-timers were indoctrinated with the well-established etiquette and protocols required to thrive in this brave new world.

It seems archaic now, but back then, in the “before times”, there was no way for mass discussion; social media had not yet been born.

The plot twist

And then it happened. AOL, then a name synonymous with the internet, decided to grant access to Usenet for all of its customers. Picture the mobs that gather outside department stores the morning after Thanksgiving: the unlocking of the door let loose a mass of people that overwhelmed the community. There were just not enough graceful souls able to help coach these new users in “civilized” net behavior. Social norms were thrashed; standards went out the window. It was the equivalent of the wild, wild west. In a word, it was chaos.

Future looking

Now think of how you on-board new designers or engineers. You show them who’s helpful and who to avoid. You show them around, pointing out places of interest, and teach them company standards, design methodologies, workflow processes, etc. Over the coming decade (to be exact, 2014 through 2024), according to stats provided by the Bureau of Labor Statistics (BLS), the Architecture and Engineering field will grow an average of 3.4%, or about 710,000 jobs.

The biggest (projected) job gainers:

  • Civil – 106,700
  • Mechanical – 102,500
  • Industrial – 72,800
  • Electrical – 41,100

Couple this with the BLS projection of labor force participation over the same time period, where we’ll see a 1:1.3 ratio of people leaving the work force to people entering. That is a lot of churn, meaning a lot of people to on-board. The products will be ever more complicated, and the enabling technology will be as well. Technology is cited as one of the reasons the field isn’t growing as fast as other areas: the productivity gains in PLM are making companies more efficient, even as complexity grows.

Conclusion

Companies will need a strategy for managing changes in their employee base as well as the technology evolution. We offer a series of benchmarking and analysis services called PLM Analytics, including one aimed specifically at this issue called PLM Support. Let us know if we can help solve your Eternal September.

Any organization managing product introduction must have an underlying project plan determining how this is going to happen. Any design process goes through various stages from initial concept to final product. A generic process is illustrated below:

[Diagram: generic design process from initial concept to final product]

Now, this may be simple at a concept level, but it can become incredibly complex at an execution level. Consider a 500-person team working on a project, with all of their activities needing to be coordinated. Even more crucial to overall success is the management of deliverables – has every participant delivered his or her contribution to the project on time and to the required quality?

This is where an integrated PLM and project management system can be a powerful tool. In such a system, engineering and design deliverables are attached to tasks in a project plan, and the associated task can only be considered complete once the deliverable is in place. If project management is executed in a standalone system, there is no link between task and deliverable, and no way of knowing for certain whether what is reflected in the project plan corresponds with reality.
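A minimal sketch of the idea, with hypothetical classes: a task’s completion status is derived from its attached deliverables rather than asserted by a project participant:

```python
from dataclasses import dataclass, field

# Hypothetical model of deliverable-gated tasks in an integrated
# PLM/project-management system.
@dataclass
class Deliverable:
    name: str
    delivered: bool = False  # set True when the item lands in the system

@dataclass
class Task:
    name: str
    deliverables: list = field(default_factory=list)

    def is_complete(self) -> bool:
        # A task cannot be reported complete unless every attached
        # deliverable actually exists in the system.
        return all(d.delivered for d in self.deliverables)

drawing = Deliverable("Bracket drawing")
task = Task("Release bracket design", [drawing])
print(task.is_complete())  # → False: the plan cannot claim completion early
drawing.delivered = True
print(task.is_complete())  # → True: status updates the moment the deliverable lands
```

This is the mechanism behind “invisible” governance: status reports simply read the derived state, so there is nothing for participants to fill in, and nothing to fudge.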

So as a project manager, which would you prefer?

  1. A project plan that is disconnected from the required deliverables and may or may not reflect reality.
  2. An integrated system where the project plan is tied to engineering and design deliverables.

There are several consequences of such an integrated system:

  1. Project managers can immediately see if deliverables have been fulfilled; there is no need to verbally or formally query a project participant.
  2. Milestone reviews can be conducted efficiently; either the milestone deliverable is in the system or it is not.
  3. Project managers are presented with real-time status reports and dashboards; as a deliverable is attached to a task, the report is updated.
  4. There is no hiding the dates for completing a task; the timestamp of when a deliverable is completed is visible for all to see.

All of this allows for what the PLM world calls “automatic” or “invisible” project governance. Projects are self-governing, with all participants being aware of status in real time.

Wouldn’t you want this kind of system?

© Tata Technologies 2009-2015. All rights reserved.