The AutoCAD 2017 Essentials for New Users online training course teaches new users how to get started with AutoCAD 2017 effectively. After completing this course, you will be able to navigate the interface and perform basic commands, create basic drawings, manipulate objects, organize drawings and use inquiry commands, alter objects, work with layouts, annotate drawings, work with dimensioning, hatch objects, work with reusable content, create additional drawing objects, plot your drawings, and create drawing templates. The course is also a good refresher for experienced users; it is used in our Tata Technologies classroom sessions and to prepare students for the AutoCAD certifications.

The training is available as one comprehensive course, AutoCAD 2017 New User Essentials, or broken into smaller courses in the AutoCAD 2017 Essentials Learning Path.

Subscribers to the i GET IT Basic and Professional subscriptions have access to this material automatically. For plan information, visit https://www.myigetit.com/plans.

It’s that time of year in the Siemens PLM world. Siemens NX 11 was released in late August, and new functionality and streamlined tools always get me excited. This year marks the 10th year I’ve been using Siemens products, which I realize is a short time in our industry, but it does make me feel a bit nostalgic about my introduction to them.

10 years ago, I graduated college with my B.S. in Mechanical Engineering and landed my first job as a Project Engineer with a company that produced high-end carbon fiber performance parts. A friend of mine from college was starting there at around the same time, and when I asked him which design software we would be using, he informed me it would be a product called NX. At that time, not being well-versed in all the CAD options in the marketplace, I was not familiar with Siemens NX and worried that I was about to become experienced in a piece of software that wasn’t widely used. As I said, I was not very well aware of the true marketplace!

We started on NX 2, and since it was new software for the company, we, as the young engineers, were to prove out what it was really capable of. From the very beginning, I took to the software much more quickly than I ever had with PTC or Works while in school. NX offered not only ease of use but powerful design tools I had never had access to before. Since we were a manufacturing shop as well, we picked up NX CAM to program our NC mills and lathes to produce the fixtures and tooling used to create our parts. Once again: new software, new capability, but nobody knew it, so it fell to us to learn another part of the software. Eventually, we also procured Femap to run static and dynamic load analyses on our composite layups to ensure part strength and durability (we were creating carbon fiber prosthetic ankles at the time that had to cycle through millions of steps over the course of a month to pass quality requirements). So within a year, I had come to know the CAD, CAM, and CAE sides of Siemens applications quite well, and I continued to learn and grow with the software during my years there.


Fast forward 10 years, a few jobs, and countless projects and experiences with Siemens products, and I still find myself impressed. I remember when Synchronous Technology was first released and the impact it had on the industry. I remember year after year of improvements to functionality, the GUI, dialogs, system stability, and capability. I remember the advancement of freeform tools, and the “wows” as users and prospective users found ways to do their jobs they had never seen before. The Siemens product line itself has continued to grow and become more diverse over that time, delving into every aspect of modern product design, from industrial styling to noise and vibration analysis. Siemens’ acquisitions of industry-leading software companies, and the integration of those technologies into their flagship products, have positioned them as a world leader in digital engineering, digital manufacturing, and Product Lifecycle Management.

I feel lucky that I have been able to touch so many different aspects of the software over the last 10 years, and I am always amazed at the improvements that come with each and every release.

Siemens PLM continues its long history of creating the most powerful and flexible design software in the world today. And as for NX 11, I covered some of the most exciting new features and functionalities in a webinar we hosted just last month. Missed my presentation the first time around? Click here to watch it on demand.

A while back, I was visiting a customer with an interesting design challenge. They happened to be a specialty fastener manufacturer, and a big part of their design work includes the development of the part geometry (and associated tooling dies) as it goes through the forging operations to produce the final part. Just imagine that every change of the component from one forming operation to the next must maintain the same part volume. If the bolt’s head is shortened, then it must also increase in diameter to maintain the same volume. When making a bunch of design changes, you can only imagine how many attempts must be made at changing parameters to get the volume correct.

Since this customer is using Autodesk Inventor, there is an automation environment called iLogic that can be used to solve this challenge. With a bit of minor customization in iLogic, an iterative process can be developed to automatically adjust one parameter when another changes.

The following code could be adapted in iLogic to satisfy many similar situations:

Parameter.UpdateAfterChange = False

Dim CurrentVolume As Double
Dim VolumeDelta As Double
Dim OldVolume As Double
Dim PercentChange As Double

'Seed the percent change with a large value so the loop runs at least once
PercentChange = 10

If HeadDepthChange <> 0 Then
    'Capture the part volume before any geometry changes
    OldVolume = CDbl(iProperties.Volume)
    HeadDepth = HeadDepth - HeadDepthChange

    'Iterate, nudging the head diameter until the volume nearly matches the original
    While Abs(PercentChange) > 0.00000000001
        RuleParametersOutput()
        InventorVb.DocumentUpdate()
        ThisApplication.ActiveView.Update()

        CurrentVolume = CDbl(iProperties.Volume)
        VolumeDelta = OldVolume - CurrentVolume
        PercentChange = VolumeDelta / OldVolume
        HeadDia = HeadDia + HeadDia * PercentChange / 2
    End While

    'Report the final numbers once the loop converges
    CurrentVolume = CDbl(iProperties.Volume)
    VolumeDelta = OldVolume - CurrentVolume
    PercentChange = VolumeDelta / OldVolume
    'MessageBox.Show(PercentChange, "Final Percent of Change")
    MessageBox.Show("Original Volume = " & OldVolume & "  New Volume = " & CurrentVolume & "  Volume Difference = " & VolumeDelta, "Volume Change")

    'Reset the trigger parameter so the rule does not re-run until the next change
    HeadDepthChange = 0
End If

This quote embodies a phenomenon in today’s entertainment media culture: the Alternate Reality Game, or ARG. ARGs aim to bring the audience deeper into the product experience and to rev up the hype. The game’s producers build an elaborate web of interconnected “plot points,” for lack of a better term. These plot points are not overt and are not communicated outright; the audience has to dig for them, picking up clues and solving puzzles, each of which leads to the next.

A recent entry into the world of ARGs comes from Blizzard Entertainment’s new game, Overwatch – a game whose players are very interested in information about the next character planned for release. It started with a list of character names on a piece of paper seen in the game, all of them familiar except one. That started the rumors that the unfamiliar name could be the next character to be revealed. Next, Blizzard released a couple of videos on YouTube in which a seemingly innocuous blip (below left) actually turned out to be an image; once the colors were enhanced, it revealed a series of numbers (below right). A member of the community read the numbers as hexadecimal values, XOR’ed them with the number 23, and mapped the results back to readable letters…which turned out to be a phrase written in Spanish.

[Image: the video blip (left) and the color-enhanced frame revealing a series of numbers (right)]
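For readers curious about the mechanics, here is a rough Python sketch of that style of decode. The byte values below are placeholders chosen purely for illustration, not the actual clue data; only the idea (XOR each hex byte with 23, then read the characters) comes from the story above.

# Hypothetical byte values for illustration only - not the real ARG clue.
sample_hex = ["7f", "78", "7b", "76"]

# XOR each byte with 23 and map the result to a readable character.
decoded = "".join(chr(int(h, 16) ^ 23) for h in sample_hex)
print(decoded)  # prints "hola" for these made-up values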

Another clue was solved when a similar strange anomaly appeared in another video. It was a series of images that looked like the color-bar test patterns you used to see on a TV station late at night after it stopped broadcasting. Another of the images was a series of horizontal black and white lines. One player took those lines, turned them 90 degrees, and read them with a barcode reader. The result was, of course, more hex values, which were converted into binary. Mapping that binary onto pixels – 1s black, 0s white – ultimately revealed a QR code. What did the code reveal? “Was that easy? Now that I have your attention, let’s make things more difficult.” As of this writing, the ARG is still alive and well, with more pieces being revealed regularly.
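The pixel trick itself is simple enough to sketch in a few lines of Python. The bit string and row width here are made up; the point is only that stacking rows of black (1) and white (0) cells reconstructs an image such as a QR code.

# Made-up bit stream and row width, purely to illustrate the technique.
bits = "111101101011110100101111"
width = 6

for start in range(0, len(bits), width):
    row = bits[start:start + width]
    # 1 -> filled block, 0 -> blank; printing the rows stacks them into an image
    print("".join("#" if b == "1" else "." for b in row))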

Where Does PLM Come In?

09 0C 0B 0E 0B 17 04 12 19 0B 11 06 19 03 12 17 18 03 02 19 18 18 15 04 05

So why is this story in a PLM Insights blog? Well, I’ve seen many companies treat their design and engineering data like an ARG – meaning that lots of people are interested in it, it’s all over the place, and only a few (smart and creative) people really know how to find it. Whether it’s the “smart” part naming scheme from the late 80s that needs some kind of secret decoder ring, or the folder naming conventions where folder names serve as obscure metadata for the files they contain.

An example part file (the names and numbers have been changed to protect the innocent):

S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813\Units Designed By ABC\22648-9399 OP 30 Backup ASM Fixture reworked (same as 22648-9899) See XYZ Folder for new Op80 design\OP80 reworked 4-16-08 same as 22648-9899\Libraries\OP30\Purchase\Stndrd\fd65645_2

Here we have used just about every bit of the Windows character limit (which is 260, for those interested: 3 for the volume designation, 256 for the path and file name with extension, and a null terminating character). Anyone who can manage files this way is clearly a talented individual. Even more impressive is that they were part of a 23-person design team that did all of their projects this way. I can’t imagine the frustration of new hires trying to find anything in this kind of environment.
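As a quick illustration, a few lines of Python can flag paths that are flirting with that limit. The constant reflects the 260-character figure quoted above, with one character reserved for the terminating null; the sample path is a shortened, anonymized stand-in, not the full path from the example.

MAX_PATH = 260  # drive designation + path and file name + terminating null

def check_path(path: str) -> None:
    # Usable length is MAX_PATH minus the terminating null character.
    usable = MAX_PATH - 1
    print(f"{len(path)} of {usable} characters used ({usable - len(path)} to spare)")

check_path(r"S:\22648 COMP-LOGISTICS TAILCROSSER BG-F813\Units Designed By ABC\Libraries\OP30\Purchase\Stndrd\fd65645_2")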

The benefits of searching for data are pretty clear (see Google). Yet to this day, companies are still using antiquated methods because they think it’s “cheaper” than implementing a PLM system. Our PLM Analytics benchmark and impact analyses have proven otherwise, and that doesn’t include the myriad other benefits a PLM system offers. Let us know if you’re done playing the engineering and design ARG!

FYI, there is no ARG in this post…or is there?

Among technology practitioners, there is no shortage of pundits offering predictions into the future and where the next big wave is going to hit. The reason for this is that the stakes are high – a correct forecast of future technology trends can literally be worth billions.

So what are the current predictions talking about? Here is a sampling of the current buzz:

  1. Big Data
  2. Social Media
  3. Crowd Sourcing
  4. Social Computing
  5. Mobile Connectivity

So how does this impact PLM? Traditionally, PLM is conducted on internal infrastructure in secured environments using traditional devices. For example, an average designer concerned with the creation of 3D CAD data would be working on a company workstation behind a firewall. Similarly, an engineer creating BOM data would be using a secured client installed on his company laptop. The possibility exists that the engineer may take his laptop home and interact with the PLM system via a VPN, but this is probably the extent of “mobility.”

Returning to the technology buzz, consider the potential impact of two trends – mobile connectivity and social computing. Consider the following scenarios:

  1. Your newly recruited engineer has transitioned his digital life to his tablet and no longer uses a laptop. (hence the title of this piece)
  2. The VP of Engineering wants to query the status of his product introduction using his mobile phone.
  3. Your company wants immediate access to customer feedback on existing products so that this can be translated into requirements for new or updated designs.

Given the traditional model sketched out earlier, implementing anything close to these scenarios is almost impossible. The infrastructure, mindset, and processes will not support mobile connectivity from alternative devices, nor allow general access to a requirements-gathering front end. It also raises a whole lot of questions around data security, use of private devices, and non-company access. While the technology to achieve these scenarios probably exists, it would require a considerable investment of money and effort to make it happen.

This leads to the fundamental risk-versus-investment equation. It may be possible to construct a business case that justifies the outlay. At a high level, two possibilities exist:

  1. Traditional PLM infrastructure is good enough for at least the next ten years and can be re-evaluated then.
  2. Changing the way business is conducted is a do or die activity and this includes PLM.

An informal survey of small to medium-sized companies shows that most participants have not even considered these technology trends – in part because there appears to be no business imperative, and in part because there are other, more attractive avenues for immediate investment.

So, do you want your engineers to be doing all their engineering work on a tablet anywhere in the world?

In today’s engineering environment, there is a plethora of design tools available. One question I often hear is “Why CATIA?” It’s a question that seems simple enough, but the answer is much more complex. CATIA generally involves a greater initial investment, but in terms of overall design cost, you may be surprised to learn that a CATIA license can be a real bargain.

Ask: “What are we trying to accomplish?”

What type of design work are you doing? Do you require the ability to create complex surfaces? Are you going to create a small number of models and small assemblies or will there be a large number of models and large assemblies? Are you sharing the models with customers or vendors? Do you start every design from scratch or reuse as much data as possible?

The list of questions above is certainly not complete, but you can see from the number of questions already posed that the answer is multifaceted.

Complex Surfacing

Let’s look at the creation of complex surfaces. Many CAD systems can create surfaces at some level, but what if your company needs to create complex shapes? Ask how many CAD systems can create complex surfaces, and the list gets shorter – much shorter. Next, how many systems can modify complex surfaces? One example is the actual morphing of a complex surface: one might use this ability to compensate for springback in a metal stamping or to counteract warpage in a plastic part. Now the list is much shorter. CATIA can easily handle these operations.


Large Assemblies

Next, let’s look at large assemblies – something on the order of 500 to 1,000+ models. While virtually all systems can create assemblies, what happens when these assemblies get very large? Can the system handle them? How are you going to manage them? Is the system still able to operate, or has its performance degraded to the point that it is virtually unusable? CATIA can handle very large assemblies – entire automobiles, aircraft, ships, and so on – and with CATIA V6, management of these models is available out of the box. Again, the list is short at this point.

Data Reuse

Lastly, let’s look at data reuse. […]

With globalization and distributed product development, adhering to a single CAD tool across globally dispersed design teams and suppliers has become impractical. Business dynamics like mergers, acquisitions, partnerships, and flexible supplier selection based on direct material sourcing processes can also present companies with a multi-CAD scenario. In such scenarios, a PLM tool that cannot support multi-CAD limits a company’s overall engineering flexibility. It can either force them into costly, error-prone, risky, and time-consuming CAD platform migrations, or leave the global teams working in isolation with a variety of different CAD tools, data, and processes. In the latter case, each design group generates and stores design data independently, with no mechanism to work as an integrated whole. Time-consuming and error-prone processes for finding and managing CAD data and assembling release packages mean design decisions are often based on incorrect or out-of-date information. The result is design delays, as users can’t see each other’s design changes immediately, and inconsistent adherence to change and approval processes.

Teamcenter, with its ability to manage more than one CAD system on the same platform, helps companies mitigate both present and future multi-CAD challenges. It has out-of-the-box integrations with all major commercial MCAD tools, including NX, Solid Edge, CATIA, SolidWorks, ProEngineer/Creo, Inventor, and AutoCAD. Teamcenter’s CAD integration and data management capabilities are very rich: an embedded TC ribbon, standalone launch of the TC-integrated CAD tool, advanced search, update/save to TC, update/synchronize data being worked on by others, change impact analysis, initiating changes, and saving revisions – all directly from CAD without the need to open the Teamcenter application. Along with these capabilities, Teamcenter’s four-tier architecture (optimized for high-latency environments) and its multi-site, security, and supplier integration solutions help global engineering teams operate and collaborate at different levels of business integration, from a tightly integrated process mode to a loosely coupled, on-demand mode.

In a multi-CAD environment, design groups receive designs from other teams and/or suppliers created using different CAD tools. They have difficulty aggregating CAD data from multiple CAD sources to visualize and analyze assemblies, and BOM structures aren’t adequately connected to visual content for digital mock-up. This lack of connection undermines the timeliness and quality of decision making and forces teams to spend time and money aggregating, reviewing, and validating designs and design changes. So it is key to enable designers to visualize and analyze product data from different CAD systems and to conduct design collaboration reviews across geographically distributed sites. Teamcenter, with its industry-leading visualization capabilities, provides the ability to visualize multi-CAD data in the neutral JT format and then simulate various assembly modes for downstream engineering, manufacturing, and service processes.

All of this leads to:

  • Improved productivity: Enable design teams and suppliers to use the tools they are most familiar with and to use/reuse component designs created by other teams/suppliers on other MCAD systems.
  • Accelerated product development: Find the right design information quickly; structured workflows enable development groups to work together as a single entity irrespective of location.
  • Increased quality: Find the right data and understand the dependencies to intelligently assess the impact of changes.
  • Reduced costs: Modify and share component designs created by other teams/suppliers on your preferred CAD systems and incorporate them into multi-CAD assemblies or product designs.

Do you have any thoughts to add? Questions on how Teamcenter might apply to your design environment? Leave a comment and let’s chat.

You have a PLM system. Fundamental to this system are the concepts of a version and a revision. However, these are probably the most misunderstood concepts in the PLM realm: the terms mean different things to different people and are often used interchangeably and inconsistently.

For the purposes of the rest of this piece, we will use the following definitions:

Version – represents a small incremental change in the design that would be saved in the database. Versions are not necessarily saved permanently beyond a revision.

Revision – represents a significant event in the design process and is saved permanently in the database for reference throughout the design process.

Diagrammatically, the difference is illustrated below:

[Diagram: versions vs. revisions]

It is often confusing to talk about this subject because the terms are used interchangeably. Also, the distinction between a version and a revision is not clearly understood, even to the extent that participants think they are the same thing. Because of this, it is important that any organization with a PLM system ensures that all participants clearly understand the definitions and the difference between them.

In a collaborative PLM environment, participants are very dependent on viewing or using data generated by other participants. For example, a headlamp engineer needs the position of the locating holes in the sheetmetal to be able to design his locating pins (if this is the order of precedence). In this scenario, the headlamp engineer will say, “I need the latest sheetmetal to begin my design.” This statement is common in design and engineering teams. However, it is inherently imprecise because it raises the question: do you need the latest version or the latest revision?

Based on the definition given earlier, what is really required is the latest revision. A version is a work in progress and could be incomplete or half-done because the responsible author may be in the middle of a redesign or new concept. For this reason, a version should not be visible to the larger organization; only revisions should be accessible, as they satisfy the definition of “best so far.” This concept is very difficult to get across to a lot of people and represents the conundrum referred to in the title. It takes some courage to work on data that will change sometime in the future, but this is absolutely required in an efficient design process.
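To make the distinction concrete, here is a minimal sketch in Python – not any particular PLM system’s data model or API – of the rule described above: consumers only ever see released revisions, while in-progress versions stay with the author.

from dataclasses import dataclass, field

@dataclass
class DesignItem:
    name: str
    versions: list = field(default_factory=list)   # work in progress, author-only
    revisions: list = field(default_factory=list)  # released, permanent, "best so far"

    def save_version(self, data):
        self.versions.append(data)                 # may be incomplete or half-done

    def release_revision(self):
        self.revisions.append(self.versions[-1])   # promote the latest version
        self.versions.clear()                      # intermediate versions need not persist

    def latest_for_consumers(self):
        return self.revisions[-1] if self.revisions else None

sheetmetal = DesignItem("sheetmetal")
sheetmetal.save_version("locating holes, draft 3")
sheetmetal.release_revision()
print(sheetmetal.latest_for_consumers())  # what the headlamp engineer designs against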

The version/revision conundrum also leads to some interesting human psychology. Consider any collaborative design environment where multiple participants have to submit data into a PLM system to progress a large project. It is important in these environments that all participants follow the mantra of “publish early, publish often” or, per the nomenclature of this piece, create many revisions. This is based on the principle that incomplete or slightly inaccurate data is better than no data at all.

However, process managers often put in systems that highlight inaccuracies or incomplete data, effectively punishing early publishers. So data authors hold back and only create revisions when they are certain of accuracy, late in the process. This is counterproductive.

So pay attention to the version/revision conundrum; clear definitions and policies of this simple issue can greatly improve a PLM process.

 

When I think of the countless customers I have consulted with over the years, it amazes me how many don’t use parameters to control the design and capture design intent! What is a parameter, you ask? A parameter can be thought of in two ways when it comes to CATIA V5. Parameters are built the moment you start a new part – as you can see in the image below, parameters for the Part Number, Nomenclature, Revision, Product Description, and Definition are created automatically – and parameters are created each time you build any feature. These types of parameters are known as system parameters.

[Image: parameters created automatically with a new CATIA V5 part]

You can and should build your own parameters to define your design intent. Doing so during the initial stages of a design is every bit as important as making sure sketches are constrained properly. In fact, it helps with your sketch constraints (every constraint is a feature that has parameters associated with it). In the simple example of a piece of standard rectangular tubing shown below, there are constraints defining the height, width, wall thickness, and radii. Even though this is very easy to create, as a designer I would want to design it in such a way that I never have to waste time designing a piece of rectangular tubing again. As a design leader, I feel the same and don’t want any of my designers doing this again in any design that involves rectangular tubing. The use of parameters will get us there!

[Image: rectangular tubing sketch]

 

The parameters I am talking about are user-defined parameters: simple to create but very powerful in their functionality. The simplest way to create a user-defined parameter in CATIA V5 is through the fx icon found on the Knowledge toolbar.

[Image: Knowledge toolbar]

You might be thinking, where have I seen that icon before? Oh yeah – in Excel, when I need to create a formula for a cell. That is exactly the point: in Excel, I use this function to compute things for me and make it easy to come up with a desired result. In CATIA, we will create some parameters and then, when necessary, assign formulas to them to come up with our desired result. When you click on the icon, you get the Formulas dialog, and when you click on the drop-down list next to the New Parameter of type button, you can see you have many, many options.

[Image: New Parameter of type drop-down options]
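Conceptually – and this is just a sketch in Python, not the CATIA V5 automation API – a user-defined parameter plus a formula behaves like a named value that other values are derived from, exactly as a cell formula does in Excel. The parameter names and numbers here are invented for illustration.

# Hypothetical parameter names and values, for illustration only.
parameters = {"Height": 50.0, "Width": 25.0, "WallThickness": 3.0}

# "Formulas" derive one parameter from others, capturing design intent.
formulas = {"CornerRadius": lambda p: 2.0 * p["WallThickness"]}

def evaluate(params, rules):
    resolved = dict(params)
    for name, rule in rules.items():
        resolved[name] = rule(resolved)
    return resolved

print(evaluate(parameters, formulas))
# Change WallThickness once and CornerRadius updates with it - no manual rework.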

[…]

A Bill of Material (BOM) at its core is a very simple concept: a list of components needed to manufacture a finished product. So if one was making a pair of spectacles, the BOM may look as follows:

Finished Product: Spectacles

  Item      Component     Quantity
  Item 1    Right Lens    1
  Item 2    Left Lens     1
  Item 3    Frame         1
  Item 4    Hinge         2

It must be said that understanding how a BOM functions is fundamental to understanding how PLM systems work. This simple list is really at the core of the PLM system. However, simple concepts have a tendency to escalate into very complex subjects. And so it is with a BOM.

One of the complexities associated with a BOM is that an organization usually requires different types of BOM to define a single product. Most manufacturing companies have at least three types:

  1. EBOM (Engineering BOM) is the list of parts that engineers are responsible for and comprises all the components that require some sort of design input.
  2. MBOM (Manufacturing BOM) is the list of parts required to actually make the product. It typically differs from the EBOM by the components that engineering does not specifically design (glue strips, liquid fills, etc.). It may also be plant-specific.
  3. SBOM (Service BOM) is an as-built list of the parts used in a product that actually made it off the factory floor. This may differ from what the MBOM originally specified because of crises during manufacture, and it is important from a customer service perspective (see the sketch after this list).
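To make the three types concrete, here is a simplified Python sketch using the spectacles example from earlier; the extra items and substitutions are invented purely to show how the BOMs can diverge.

# EBOM: everything engineering designs, with quantities.
ebom = {"Right Lens": 1, "Left Lens": 1, "Frame": 1, "Hinge": 2}

# MBOM: adds items needed to build the product that engineering never models.
mbom = dict(ebom)
mbom["Lens Adhesive (ml)"] = 2

# SBOM: as built - here one hinge was substituted during manufacture.
sbom = dict(mbom)
sbom["Hinge"] = 1
sbom["Hinge (alternate supplier)"] = 1

for label, bom in (("EBOM", ebom), ("MBOM", mbom), ("SBOM", sbom)):
    print(label, bom)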

So the question is: how are your three BOMs authored, edited, maintained, and released? Whatever the answer to this question, the outcome is always the same:

  1. No BOM – No product
  2. Wrong BOM – Factory rework or customer dissatisfaction.

An informal survey of small to medium size companies yields surprising results: Excel is the predominant BOM management tool in an engineering environment. Manufacturing BOMs are normally handled by some sort of ERP system and service BOMs are poorly tracked, if at all. This situation is fraught with potential for disaster because of all the manual processes that have to occur before an actual product gets made.

Hence the analogy in the title. BOM management may be a hidden problem that is set to explode in an organization, especially as the products being made become more complex. PLM systems can offer a single organized BOM that represents all the different types in a consistent, controlled manner. Given the potential consequences of the bomb exploding, BOM in PLM should be a priority.

Do you have a BOM management disaster of your own to share? How about a BOM management triumph?

© Tata Technologies 2009-2015. All rights reserved.