Thursday, October 2, 2014

DynaWorks is here!! The Navisworks library for Dynamo

As some of you might be aware, I have been working on a Dynamo library for Navisworks; if not, then you'll be pleased to learn I have been working on a Dynamo library for Navisworks :)

If you want an idea of what the heck it does, go HERE to see it in action.

This was actually in almost production form a couple of months ago; however, until very recently you couldn't bring .dll files into Dynamo. When Zero Touch came out this became possible, but you missed out on a number of features: documentation, Dynamo remembering the library, and so on.

So initially the bad monkeys I had testing this stuff had to hack apart their Dynamo builds... This is not a good idea, hence the beta was shared with only a couple of people.

However, the Dynamo team started working on it from that point with a couple of others, and within two months they have gotten Dynamo to a version where we can share and access such programs without any hacks.

This just goes to show how well open source can work with an active community and a funded dev team :)

Starting to Customise Dynamo

If you're doing some simple stuff, I would definitely check out ZeroTouch; it's a great way to work. However, connecting to other APIs like Navisworks quickly introduces issues, like having all the native Navisworks objects exposed as COM objects in Dynamo, which makes for a bad user experience. So lots of wrapping is required. For those who don't know what that means: instead of dealing with an object called InwOaPathColl, it appears as NavisObjectCollection in Dynamo, which is much friendlier to understand :)
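To make the wrapping idea concrete, here is a small sketch in Python. DynaWorks itself is a .NET library, and RawPathColl below is just a hypothetical stand-in for a COM type like InwOaPathColl; the point is the pattern, not the names: hide the raw object with its awkward interface behind a friendly class that exposes only what users need.

```python
# Illustrative sketch of the wrapper pattern (not DynaWorks source).
# RawPathColl stands in for a raw COM collection such as InwOaPathColl.

class RawPathColl:
    """Pretend COM collection with an unfriendly interface."""
    def __init__(self):
        self._items = []
    def Add(self, item):
        self._items.append(item)
    def Item(self, i):          # COM-style 1-based indexing
        return self._items[i - 1]
    def Count(self):
        return len(self._items)

class NavisObjectCollection:
    """Friendly wrapper: hides the COM object, exposes a clean API."""
    def __init__(self, raw=None):
        self._raw = raw or RawPathColl()
    def add(self, item):
        self._raw.Add(item)
    def __len__(self):
        return self._raw.Count()
    def __getitem__(self, i):   # normal 0-based indexing for users
        return self._raw.Item(i + 1)

coll = NavisObjectCollection()
coll.add("Wall")
coll.add("Pipe")
```

Users of the wrapper never see the 1-based COM indexing or the PascalCase COM methods, which is exactly the user-experience win described above.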

For those who may not be aware, Dynamo is completely open source, which means you can get in there and add your own customisations to do all sorts of things. Initially my goal was to be able to extract clashing objects, data and views and be able to sync them in Revit. This would allow me to work with large Navis projects containing a lot of files, without having to have them all loaded all the time when clearing up clashing issues.

Now, many people own Visual Studio Professional; however, I do not yet. I have managed to get by with Visual Studio Express and SharpDevelop.

After this setup I was able to clone the Dynamo repo, which you can find here, or you can download it. Initially you just need to start the library and it should build. However, I had a couple of issues working with the free version.

At this stage you can create a project, open up the Dynamo project, add yours to theirs and start testing in sandbox mode. If you need Revit to debug your nodes, then you will need to plug it in a different way, which I am not going to cover here :)

From there, the only other library you need to reference is ProtoInterface, and possibly ProtoGeometry if you want to work with the native DesignScript engine to convert your objects.
Now, Dynamo will suck up all your public classes, methods and so on automatically, so you're best using the following attributes to control what shows up in Dynamo.
In Autodesk.DesignScript.Runtime, [IsVisibleInDynamoLibrary(false)] is great at hiding methods (although it seems to be buggy with properties on externally referenced classes), and [SupressImportIntoVM] can hide classes.

One really big advantage of building a separate library is that you can run it in DynamoSandbox (if you go to Program Files > Dynamo there is a .exe in there you can run, which is the Dynamo engine) or standalone. I have a feeling we will see a lot more standalone products using Dynamo in the future, not just Revit; however, you can still interact with those nodes within Revit Dynamo to achieve what you're after, which is great.

DynaWorks 15

Due to the way the Navisworks API is set up, at the moment multiple versions have to be referenced separately. To download, head to GitHub and download the zip here. This is due to the fact that the Navisworks implementation works in sandbox mode (without Revit or Navis). The Autodesk guys have the same issue with Revit; it's just that they have the advantage of hiding it, because you can only see the commands based on the Revit version you are in. I am sure there is a tricky way to do this but I have not yet looked.

PS You need Navisworks Simulate or Manage in order to use this library.

Installing the Library

First, download the relevant library from here. Once the package manager is fixed you will be able to download the library as a package, but until then you need to use GitHub.

Once you have downloaded the files, as a zip or individually, you need to copy the following files (depending on whether you have 2014 or 2015 Navisworks) to a folder location. It doesn't matter where, as long as the following files are available.
  • DynaWorks15.dll
  • DynaWorks15.xml
  • DynaWorks15_DynamoCustomization.xml
  • Interop.NavisworksAutomationAPI12.dll
  • Interop.NavisworksIntegratedAPI12.dll
Once you have those files, simply load DynaWorks15.dll ONLY; the rest will all load up.

Once you have this, you can get packages from the Package Manager or get others yourself.

With all that aside, let's get into my Navis library.

As you can see there are currently 4 main parts to the Navisworks library.

Custom:- Where I put my custom nodes.
Clash Detection:- Run tests, get tests, get results, get clashing objects.
FileSettings:- Open a Navis file, append files, Save As.
Objects:- Get nodes, search/query objects, get properties, get values, get attributes.
Views:- Get view names, camera position vectors and looking points.

Now I don't want to go through everything so I thought I would point out the major stuff here.

For the moment I have built a number of custom nodes that do a lot of the hard yards for you, so download them and check them out!! This is done for two reasons: first, because these nodes save a ton of time in setting up what can be some really basic stuff, and second, so you can get an understanding of how I have built the system and which elements need to interact with which.

FileSettings > OpenNavisFile
This is the starting point of any use of DynaWorks; it allows the user to open a file.
It has two inputs: the file path, and whether the session actually opens. This is one of the coolest features in this library, as you can actually get data, run clashes and query objects without having to open the Navisworks UI, which saves a ton of resources in accessing and using the tool.
If you do want to access the UI, just set it to true and you can still navigate, update, save and close the file if you need to.

WARNING: Do not put this node inside a custom node. First, it will always close the session after it runs if it sits in a custom node; this is a known issue. Second, if you run multiple custom nodes or other options with it embedded, it tries to open the file each time and encounters file locks etc.

Accessing Navisworks Objects (Also known as Nodes)

NOTE: Nodes are every single selectable object in a Navisworks selection tree. Each has its own attributes and properties, but Navisworks treats each one as a node instance with parents or children.

These are different from nodes in Dynamo. I know it's annoying, but to keep ease of use of the API and communication with developers, I am not creating new acronyms.

There are currently a number of ways to use Dynamo to access Navisworks Nodes:

First, you can run a clash detection, filter through the clashes and return the objects you want. There is a fair bit happening in clash detection, so I suggest you look at the published nodes and also the examples on GitHub.

Second, Objects>NavisNodes>GetFilesInProject will give you the starting point of the node files in each loaded NWC. You can then use GetNodeChildren to cycle through the lists.

Last, you can use any existing selection sets: in Objects>NavisSelection, get a list of selection sets, manipulate and create what you need, then use GetNodesFromSelectionSets.

ClassNames vs ClassUserNames

There are two options here, both equally valuable: one allows you to select objects by the Navisworks object name, the other allows you to select objects by the string name data from Revit or another external package.
To know which name you need, expand the little data readout; the first name of any node is the ClassName and the second is the ClassUserName. The same thing goes for attributes.

Accessing Navisworks Attributes

From the nodes you can access the attributes; these are the various tabs that you see in the Properties panel. You can return attributes from nodes by using Objects>NavisAttributes>GetNavisAttributesFromNodesList. This will return all the available attributes for those particular nodes in a list.

Accessing Navisworks Properties

Due to the way the Navisworks COM API works, when we want properties we have to set up a few things.
These are under Objects>NavisPropertyList; the first node returns a list of the available properties from the attribute.
These get the attribute by ClassName or ClassUserName; then you need to identify the property, and it will return all values as strings. I may work on numbers and other data types in future, but this makes it easy for now.

Navisworks GetData

Also in Objects there are GetData and NavisObjects, and each has a GetValue. The GetData option is for getting the properties and values of ANY Navisworks object: tests, results, the works. The other option is for getting the data from the Navisworks property tabs: properties and values.

The query objects add a lot of value in finding out what things are and what is in elements, so feel free to use them!!

This is a great tool for querying or filtering objects by various attributes and so on!!!
It should work on every type of object in Navisworks.

Do not use this to get property values; a special method has to be used, hence the ones I have above. Using this for property values will cause crashes or crazy behaviour.

Navisworks ClashDetection

I am not going to detail all the requirements of the Clash Detection tool, just the setup.
First you need to have set up some clash tests. You can use Dynamo to run tests and clean up resolved issues if you want; otherwise you can simply extract the clash data.

To do this, take an OpenNavisFile and connect it to ClashDetection>Create>GetClashDetection; then under ClashDetection>Actions you can get the clash tests, then the clash results, by each clash or by group.

You can get various data from these: XYZ points, views, properties/comments. However, if you want to select the objects you must use GetClashNodes, available from the same area. Once you have the nodes you can perform other operations.

NOTE: Here is some great news: if not all of your team has Navisworks Manage, they can still open the file and extract all data and clash data with Simulate!!! That way only the team members who set up the clashes and so on need Manage; other users can access the NWF project to retrieve all the data from Simulate.

You can also run new clash tests, resolve clashing issues and so forth from Simulate!

Hopefully this stays open but who knows!!!

Navisworks Views

Lastly, we can get view data. To do this we need to access Views>GetSavedViews and work from the list.
You can then cycle through and get comments, data, camera positions etc...

In future, if people want, we can look at creating Navisworks views from within Revit and updating the Navisworks file itself; for now I needed data and clash extraction.

Note: I have created a default and a metres conversion. Please note this will only work if the conversion is set to feet; so far in 2015 I can't seem to change the underlying API units. The reason for the conversion is that the Dynamo 3D view creation nodes are also in metres, and this should save a ton of time.

Getting Started

In the examples on the GitHub repo you can find a number of definitions to get you started; these are commented to show you what the nodes are doing and should work with DynaWorks14 and 15.

If you need both versions of Navisworks then be careful how you manage them!!!


The following are some examples of using DynaWorks:
  • RevitClashesElementUpdate - Updates Revit elements comments to say whether something clashes from Navisworks
  • RevitClashFamilies - Places Family Instances as Clash Points
  • CreateSuitableRevitView - Creates a suitable Revit View of the family.

Final Notes

This has been a big but fun project for me in my spare time. I know there will be bugs, problems and possible feature requests; all that information should be put on the GitHub repo here.

Also, this is a personal project, so you're going to have to put in the effort to learn it outside of my examples and this blog.

Also, I am not a support desk, so I can't guarantee how many issues, queries or problems I will be able to fix!

Sunday, July 13, 2014

Introducing the Revit Pre-Cast Panel Planner 5000

Hey Guys

I hope you are excited to read this post, as I think it showcases a lot of cool tech working together. The exercise has taken around two weeks, with my deadline on Monday, and along the way I have gained skills in Excel functions and recursion in Dynamo.

We have a rather large project (4x42 storey towers, single podium) under construction at the moment that I'm helping out with site co-ordination and 4D planning.

Now, to date the transfer has finished and the first tower is about to go up. The team has decided we will be constructing the tower floors out of precast panels. We will be casting these panels onsite, and the first of the construction molds have arrived on site. This is a very large and critical part of the project, in which over 9000 panels will be poured and installed. There are 20 molds with various accessories, from which over 90 sizes need to be created with cast-in sleeves for pipes and all the rest.

Now, storing all of these on site needs to be managed very carefully: it takes a day to pour a panel, then it needs to sit for a number of days to set before being installed. So careful management of the pouring schedule and storage is critical to meet our installation target dates, as well as to be able to react to changes and ensure we are never short.

I was tasked with developing a simulation of the precast site for our precast manager, and was also asked by him to develop a spreadsheet that would enable him to manage the panels.

So to begin with I developed a spreadsheet that would address the following goals:
  • Get the current panel types and numbers required directly from Revit.
  • Get the installation tasks and dates directly from Microsoft Project
  • Automatically calculate and develop a Just in Time pour/install sequence
  • Allow the Pre-cast manager to override the Just in Time and update automatically
  • Present all this information, showing how busy each mold will be each week
  • Show by Type each panel pouring and stacking locations as well as ready or not ready
  • Automatically colour highlight errors, problems and issues for easy management.
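To give a feel for the "Just in Time" part, here is a tiny back-of-the-envelope sketch in Python. This is not the actual spreadsheet logic, and the cure time and buffer are illustrative numbers; the only grounded assumption is that a panel takes a day to pour and then sits for some days to set, so the pour date is worked backwards from the installation date.

```python
from datetime import date, timedelta

def latest_pour_date(install_date, cure_days, buffer_days=0):
    """Work backwards from the install date: a panel poured on day D
    is ready on D + cure_days, so the latest safe pour date is the
    install date minus curing time (plus any safety buffer)."""
    return install_date - timedelta(days=cure_days + buffer_days)

# Illustrative numbers only: a panel installed on 4 Aug with a 7-day
# cure and 2 days of buffer must be poured by 26 Jul.
pour_by = latest_pour_date(date(2014, 8, 4), cure_days=7, buffer_days=2)
```

Run per panel against the Microsoft Project install dates, this kind of calculation is what lets the sheet flag panels that are at risk of being poured too late.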
Needless to say, this spreadsheet was a pain in the ass. I actually tried a number of planning and 5D solutions first with little success: nothing we have is really built to show consumption correctly, or you can do it at macro levels but not at the level we required. Whilst Excel manages time quite well and we can obviously do all sorts of functions, my spreadsheet contains no VBA or macros, just some fancy functions.

However, I managed to complete the schedule and it looks quite pretty. Here is a week-by-week look at a panel type, the stacking optimizer and the setout.

Stacking Schedule
Orange means in storage but not set properly; green is set; installation numbers and setout are all below. The entire spreadsheet is completely automated.

Once it was completed and the precast manager was happy that he could work with the data, I then started to figure out how I would actually show a week-by-week analysis of the precast yard.
First we needed a model of the yard layout with our anticipated requirements, so we grabbed all the panels, dumped them over the project and made it look pretty, like below.

Then I went to Navisworks; however, the best I could make it do was butcher the 5D pricing simulation, linking the schedule to show the total number of panels with an addition or subtraction option. That works great for the logical sequence of total panels poured, but is pretty useless for a precast yard layout. Building a model that would actually show the proper number of panels, and update them to show whether they are ready or not, wasn't going to happen. I then also tried Vico, but again these tools are for construction: install and complete, or install and demolish, not install, update, remove, install, update, remove etc... I needed something parametric :)

So I had a good think, and the only thing I could think of was Zach Kron's Dynamo solar optimization installer; you can find out about it here.

The key thing this was doing was grabbing a Revit object, manipulating it and saving/reading data from a file multiple times, which is basically what I wanted to do!!
The key was enabling Revit objects to be updated from each panel type's stacking locations, as well as showing whether panels are ready or not ready. Now, the stacking page in Excel only shows one panel at a time, so in addition to getting the values for a panel I would then need to iterate through the panel types one at a time, get the stack numbers and update the values for each stack of each panel type.

This would effectively give me a week-by-week view of what the precast yard will look like based on the updated Revit data, installation dates, actuals to date and the precast manager's forecasting, all in one. Plus I would be able to tag and print out the information if required!!!!
Live documentation!!

Dynamo to the rescue.... sort of.....

Before doing this exercise I considered myself an OK user of Dynamo, having given a talk at RTCAU recently; however, this exercise really made me flex the legs of this powerful addon :)

First off, I needed to update my single panel family into the stacked panel family. This involved making sure all the types were loaded in, that an array and empty options were available, and that it had a value for the stack height. I have a quick API tool that updates the stack panel instance parameter with the name of the nested panel, for checking against the spreadsheet.

Upside-down view showing a nested family with a simple array and some formulas to control the visibility.

I started by downloading Dynamo 7.1 and getting stuck into it. I was able to pretty quickly develop something that would read a single stack and update it.

Delving into the world of dynamo recursion..

Next came the really hard part. For those who don't know what the hell recursion is, basically it's a software term for something that can call itself; if you're familiar with Fibonacci sequences, they are a pretty good example. This allows you to loop your Dynamo node to check each stack one at a time, rather than make a million nodes to account for each variance.
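For those who prefer text code, the classic recursion example looks like this in Python (Dynamo does the same thing with nodes rather than a function definition; the code is just to illustrate the concept):

```python
def fib(n):
    """Classic recursion: the function calls itself on smaller inputs
    until it hits a base case, which stops the self-calls."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Looping over the panel stacks one at a time is the same shape of
# problem: re-apply the definition to a shrinking input.
sequence = [fib(i) for i in range(10)]  # 0, 1, 1, 2, 3, 5, 8, 13, 21, 34
```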

To say I encountered difficulty would be an understatement. First, the Dynamo examples of direct recursion simply don't work, and there's not a lot of advertising that they don't work anymore. After a few hours of feeling like an idiot I emailed the Autodesk guys, and they told me it's now broken, has been for a while, and they hope to have it back soon; however, there are some clever ways to do it, so go and check those out. Due to the lack of nodes, familiarity and examples in 7.1, I ended up ditching it and jumping back to 0.6.3, which had a lot more examples, including both the Solar Optimizer and the nodes I thought I needed for input/output of Excel files that were not in 7.1.

So I checked out Zach Kron's (yes, Zach again :)) example on Fibonacci sequences and started to pull it apart, completely rebuilding my 7.1 script in 0.6.3, which was no easy task as a direct translation; I also discovered that the AND node works more like the OR node (hint: just use a formula to solve these problems in 0.6.3).

By this stage I was well into Saturday evening with my deadline and my wife both looming over me, neither extremely happy with my progress, considering I had to present everything at the coordination meeting on Monday.

So I began Sunday morning with some new energy and armed with the Fibonacci example.
In order to get all the stacks ready for the panel type, I needed to iterate through all the stack numbers and update the values of each family. With some push and shove I got this working pretty easily.

Now the hard part: in order to get the panel types to update, I actually needed to write data from Dynamo to Excel with the number of each panel, so that it updates the Excel stacking data and then goes and gets the information. This was also difficult :), in part due to the fact that Dynamo will actually override all the other data on a sheet when it updates a single value, so be EXTRA CAREFUL. I was actually working on my master spreadsheet when I did this and had to rewrite a bunch of formulas, losing an hour.

The end of the tale is that I managed to complete the spreadsheet, the Revit Pre-cast Planner 5000, with time to have dinner and write this blog update :). The focus was using Revit and Dynamo to enhance a number of our planning activities, utilizing the parts of 4D that aren't traditional build-this-or-build-that. I think it has the ability to have a great impact on construction works and producing live documentation.

The key moving forward is that we now have a pretty solid tool for planning precast locations on this job and on future jobs, with precast being more utilized. Whilst some people will use a factory, buying the molds and doing it yourself is cheaper as long as you can find the storage, and I hope this tool will help our teams optimize our spaces and casting sequences in the future.

Without further ado I give you Precast Planner 5000 the movie!!!
My first ever YouTube upload.
Don't expect high quality video editing though :)

ps. Now that I have completed it, I will try to build the updated version for 7.1 and see how difficult it would be.

Thursday, June 26, 2014

GitHub and RevitAPI as well as RTCNA Wrapup

Hey Guys

I thought I would jump in with a quick post to cover API stuff and RTCNA wrap up for this year.

First off I have put all my source code from my 4 RevitAPI coding labs onto github.
It's got a bunch of starter code; I plan to get the rest of the handout and PowerPoint information up today.

The plan is to get people to sign up and post their own problems as code on GitHub, and we can all start to work as a team. It's not designed for ALL of your program or secret tool, but for the key snippets that help build a program up; I know we could all use those to assemble our code.

So get on board; it will also help those who are new to coding learn to use version control, which is always a good idea.

So I was lucky enough to attend RTCNA2014, held in Chicago this year, and I have to say wow!!!
Not only did the event sell out and hit the cap for attendees, but the presentations from Autodesk and users were really top notch.

Check out Erik's blog for all the pics, of which I am in there somewhere :)

Some of the examples simply blew me away with what people are trying to do and how they are getting there, and it's the willingness to share successes and failures with the broader community that is always exciting.

On top of the classes, the other best part of any RTC is the networking. The conference is geared towards networking, and again this one was awesome: I met a bunch of new friends from the US and the UK, and bumped into people who are running things with firms we are about to start working with. RTC really helps you make good connections with people you can talk to, and more and more you go to RTC not only for the presentations but to trade war stories, as each person struggles to innovate, collaborate and integrate their people, projects and companies to achieve better outcomes for everyone.

You always come back refreshed and ready to take on another year of slogging through broken Revit files, imported CAD data, PDF markups, paper and the other things people secretly do when you're not constantly on watch.

This really raised the bar for me, and as the number of RTC events grows it becomes harder to figure out which ones to attend, for both the friends and the content you only get to see once a year.

For me, RTC is the only international BIM technical conference I put my time and effort into. There are others, but they simply don't have the culture, expertise and spirit that RTC seems to embody in everybody.

Last but not least there is always some good tech on show and this year did not disappoint with our firm becoming part of the Revizto clan and getting the software for collaboration on our designs.

I hope to see some of you at one or maybe all events next year.

Wednesday, June 4, 2014

RTCAUS quick review

I have returned from RTC and I am now back at work, nothing like getting off a plane and walking into a 6 hour site co-ordination meeting. :)

I just wanted to do quick recap of RTCAUS2014. For those unaware check out

This year the event was held in Melbourne, and it was great to be back in a milder climate. I find each year the quality of the presentations and the knowledge of attendees jumps in leaps and bounds. What is even more interesting is that much more of the focus is away from the BIM authoring technologies themselves. Although there are still a great number of BIM software how-tos, productivity talks and labs, the bigger focus and the more engaging presentations seem to be those based around workflow, government adoption and, most importantly, collaboration.

These are still at the heart of what causes most of our friction in this day and age, and how to address them varies greatly with each company's political climate. But it is nice to hear how others are doing this, and to look at what methods, formats and systems to put in place: not just for the technological issues, which are actually quite minor today in the Australian market, but to get information in front of managers of people, projects and collaboration at that higher level.

I did a talk on the Revit API, of which I am doing a repeat at the RTCUS conference as well. From there I have a couple of people interested in starting a group to share coding bits & pieces. This will appear on GitHub by the end of June for people to download, upload, point out solutions and so forth.

The idea of the GitHub repo is code snippets for other users that can be copied and pasted; it's meant to be very easy to read, with lots of comments, to allow new API users to learn to tinker, break, build and ultimately share bits and pieces of how to do things.

For now I'm working on some interesting Dynamo stuff and preparing for RTCNA, where I hope to show something in July; I'll see those attending in two weeks in Chicago.

Tuesday, May 6, 2014

Creating Intelligent Step Down Tags

Well I am now officially blogging from KL, after finishing my role with GHD in Manila.
My new role sees me working as the company BIM Manager for YTL Corporation, with our internal owners, designers, construction and operations teams, helping leverage new technologies like BIM to centralise and streamline information for our stakeholders.

My family is settled and I am working on some exciting stuff: templates, APIs, and already helping co-ordinate some projects which I hope to showcase at some point.

Anyway, as always, sitting down and starting from scratch brings new perspectives with experience, and here is a trick that has been around for a few releases but that I had never thought of using till this morning.

So, a big issue for architects and engineers with Revit is that whilst you can get FFL and SSL, it's hard to show step downs, as they are relative only to the floors they are stepping down from.

So, after planning to create the new step down symbol, I thought about how to make it intelligent, and a thought came to me: adaptive families.

Since all adaptive components can have multiple points and be face-based on selection, I thought why not create a simple two-point adaptive component, a shared reporting parameter and a tag.

Here is my mockup.

First, create a generic adaptive component and create an additional level by copying the level up, then place a point on each level. I then created a shared parameter that is connected in the Z axis only to the elements, like so.
I drew an invisible line that is 3D point based, to make selection and manipulation easier in my project.

From there it was a matter of creating a tag and referencing the shared parameter as a label.

After this, it is a matter of connecting the step down adaptive component to the two faces of the floors and tagging the element, from which you will get the reported step downs; it will update automatically when the slabs/finishes/step downs change.

Also if the floors or finishes are removed it will automatically delete the element and tag so be sure to rehost if you need to host to a different element.

I will be at RTC in a few weeks and I hope to see many of you there.

Wednesday, March 26, 2014

Point Cloud Beginner Tips

A project I am working on at the moment is a complex hotel requiring a complete refurb on an extremely tight deadline.

This project is located overseas from my office here, and I have been collating and managing the team to capture the existing conditions; we are modeling the entire structure in Revit.

To speed up the process we have been getting the whole building point cloud scanned. Whilst I have dabbled in LiDAR and point clouds on a few projects, this is my first project in which we are dealing with them on a day-to-day basis, and with a large volume of data as well.

So to begin with I have been using ReCap, which is pretty simple to use: view, cut/crop and then export the clouds for various uses.

However, before that I needed the data in a usable format. The surveyor initially provided us raw data which was useless, but I managed to find a text file that told me the make of the scanner (Leica), used that to determine what data the default technology can export, and asked for that from our surveyor.

To date we have been asking for and dealing with PTS files; however, I have since been told (thanks Brett from Autodesk) that PTG is a better format, as it's ASCII readable and therefore there are fewer translation woes, so in future this is the format we will be requesting.

Our second issue has been converting the exported files to the Autodesk ReCap scan files. As these files can be anywhere from a couple of gig to 150GB, this can take a long time; the first couple of scans were copied across to our server and then converted, with the copy/conversion process taking up to a day for the larger scans.

We have since set up a computer in the project office that I remote into, as the conversion from the point file to the Autodesk scan file can drop the size of a file by as much as 70%. We then copy the converted file across, enabling our team instant access to the new data once it has been copied. We did have an issue with the graphics card not being able to run the right version of OpenGL in a Remote Desktop session, so in this case we needed another computer in order to get access to ReCap.
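As a rough sketch of why converting before copying pays off, using the figures from this post as illustrative numbers only (the actual reduction will vary by scan):

```python
def converted_size_gb(raw_gb, reduction=0.70):
    """Estimate the Autodesk scan file size after conversion; the
    anecdotal figure above is a reduction of as much as 70%."""
    return raw_gb * (1 - reduction)

# A 150GB raw scan becomes roughly 45GB to copy across the WAN.
# (Note the temp-space cost too: one 45GB conversion mentioned below
# needed an additional 65GB of scratch space while converting.)
shrunk = converted_size_gb(150.0)
```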

The first thing I recommend when using ReCap is to set your temporary folder to somewhere you have a LOT of HDD space. Converting a large scan, say 45GB, required an additional 65GB of space to convert the file to the Autodesk format.

Our office standard is small SSD C drives, and with all the other software I only have about 40GB spare at the moment, so change this in your settings.

I will post some more key things in the future, but the major lessons so far are:
- Change your temporary folder location before converting.
- If you need to move cloud files between offices, convert them first to lower transfer times.
- Ask for .ptg files to reduce translation issues.
- Remote Desktop may be a problem if your graphics card doesn't support at least OpenGL 3.1 in a remote session.
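To put the "convert first" tip in numbers, here's a back-of-the-envelope comparison using the ~70% size reduction mentioned above. The link speed is a hypothetical figure (a 100 Mbit/s inter-office line, roughly 12.5 MB/s), not something from our actual setup.

```python
# Copy the raw scan, or convert first and copy the smaller file?
# Assumes the ~70% size reduction quoted above and a hypothetical
# 100 Mbit/s inter-office link (~12.5 MB/s).
def transfer_hours(size_gb, link_mb_per_s=12.5):
    """Hours to push a file of size_gb over the link."""
    return size_gb * 1024 / link_mb_per_s / 3600

raw_gb = 45
converted_gb = raw_gb * 0.30          # ~70% smaller after conversion

print(f"raw copy:       {transfer_hours(raw_gb):.1f} h")        # → 1.0 h
print(f"converted copy: {transfer_hours(converted_gb):.1f} h")  # → 0.3 h
```

Even on a fast link the converted file wins by a wide margin, and the gap only grows on slower WAN connections.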

Otherwise I have had no issues with locations or grid coordinates when loading files into Revit. I will provide additional updates on tips & tricks.

Monday, February 17, 2014

CAD Links Project File

I have noticed lots of people struggle with Xref CAD Linked files on Revit Servers.

This can be due to a number of things: different folder location setups at each office, or consultant- or client-supplied drawings that you need to link into your Revit model.

The easiest way for me to do this is by creating a new Revit Model called CADLinks or something similar.

Once this project is created, Copy/Monitor the Levels and Grids from your actual project.

Then for floor, elevation and section views, simply set up multiple views for each level, one per CAD link as required, then load the links in, ensuring they sit in the correct positions relative to the Grids/Levels.
I normally make sure each link is then set to Foreground in its views. This ensures the CAD file overlays on top of the floors or other model elements in the project.

Once this is set up, I activate worksharing and save the file to the Revit Server.

Then in my main project file I load this file in on its own workset, and for any view that requires a linked view, you can just set up the view's link display settings with the By Linked View option and the correct CAD link view selected.

This way, if other offices work on the work-in-progress file there is no doubling-up of CAD files, and your master file contains no linked CAD files at all, which means no issues with CAD files being duplicated or loaded off in the distance.
It also keeps bad CAD files, or bad objects inside CAD files, from causing problems in your project file, and cleans up the number of links loaded directly into it.
This also speeds up reference loading, since all the data for the linked views is held within the CADLinks Revit file, so your main model doesn't need to load hundreds of references if your consultants have been working in CAD or you are dealing with large civil datasets and drawings.

Tuesday, January 21, 2014

VEO 1.7.0 Update

The latest update of VEO is out, and damn, the models are flying. They seem to be really focused on making things faster and faster, as well as improving existing features and adding new ones.

There are some great new features like measuring, and a status-bar-style UI element allowing quick access to items and keeping track of your syncing and projects.

An important thing to note for those outside North America: we can access the Track module. This basically allows people out on site to scan barcodes, and it will select various models or documents, or go to views, based on requirements back in the office, for communicating issues and other problems.

This is disabled by default; however, if you contact M-SIX support they will activate the feature on your account, provided you are allowed to use it in your region (at last check North America was unable to use this function).

Kudos to the Team, and looking forward to seeing more great enhancements to old features, speed and the addition of new features!!!

Tuesday, January 14, 2014

Conversion of Large Site Datasets for Visualisation

As an engineering firm with a very large Civil/Water team, we do a lot of work with LiDAR surfaces and the like. These datasets tend to be very heavy, but having them in their entirety in our rendering presentations goes a long way towards not only a more realistic presentation for our clients, but also an interactive model for our engineers.

We use a number of platforms for our civil works and road designs due to client requirements; simply put, some tools are better than others at certain jobs, and some users prefer a particular tool and are good with it. On any typical day in our office we have teams on 12d, Autodesk Civil 3D, Bentley MX and/or GEOPAK.

Fortunately, Autodesk acquired a tool a few years ago that imports all these toolsets, called Civil View (it used to be Dynamite), built into 3ds Max Design as of 2014; in prior versions it was an add-on bundled with the software.

Anyway, string/alignment designs can be complex (strings/alignments are a very important tool civil designers use to create roads, kerbs and drainage from a single polyline connected to an assembly; for Revit people, think of sweeps that could auto cut/fill topography as well as create the object), so sometimes the import routines do not play that nicely, and other times it can simply be the size of the datasets. As an import tool, though, it really does a great job of quickly creating very large civil visualisations and conceptual options for stakeholders, who are typically council or government related, and adding visual enhancements like vehicles, road signage and that sort of thing.

String/feature lines in 3ds Max are only required if you want a line along which to lay out objects like vehicles, traffic lights, signage and so on; otherwise they are not doing a huge amount for your file.

3ds Max is very stable with very large geometric datasets, and I've yet to create something for visualisation so unwieldy that it was unable to handle the data. I should point out we use a GPU rendering tool; Max is our file container and manager of complex and large scene data, and we don't use its rendering tools, only the animation and scene setups.

An example is the following, in which I was able to get the road designs and pads across, but the software was unable to convert the main LiDAR surface from Civil 3D.

In this case the surface wouldn't come across, and the warnings in the export from Civil 3D told me that as well. If you are guiding a civil engineer through this process, make sure they tell you about any errors in the export, to save the hassle of trying to figure out why Max isn't working when it's not the issue.

As with any failure, it's time to drop to the next attempt at importing, and that's using LandXML. LandXML is a pretty good format for exchanging data; however, like IFC or DXF, it creates large files that describe geometry rather than design intent. Most civil programs work in a similar fashion, and the conversion is generally pretty good except for things like superelevations, assemblies and the like.
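To give a feel for what "geometry, not design intent" means, a LandXML TIN surface is literally just numbered points and triangle faces. The sketch below reads those out of a LandXML fragment; the element names follow the LandXML-1.2 schema, but the sample surface data itself is made up for illustration.

```python
# Minimal sketch of pulling TIN surface points and faces out of a LandXML
# export. Element names per the LandXML-1.2 schema; the sample data is
# invented for illustration.
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0"?>
<LandXML xmlns="http://www.landxml.org/schema/LandXML-1.2">
  <Surfaces>
    <Surface name="EG">
      <Definition surfType="TIN">
        <Pnts>
          <P id="1">1000.0 2000.0 45.2</P>
          <P id="2">1010.0 2000.0 45.8</P>
          <P id="3">1000.0 2010.0 46.1</P>
        </Pnts>
        <Faces>
          <F>1 2 3</F>
        </Faces>
      </Definition>
    </Surface>
  </Surfaces>
</LandXML>"""

def read_tin(xml_text):
    """Return ({point id: (n, e, z)}, [(i, j, k), ...]) from a LandXML TIN."""
    root = ET.fromstring(xml_text)
    pts, faces = {}, []
    for el in root.iter():
        tag = el.tag.rsplit('}', 1)[-1]          # strip the XML namespace
        if tag == 'P':
            pts[int(el.get('id'))] = tuple(float(v) for v in el.text.split())
        elif tag == 'F':
            faces.append(tuple(int(v) for v in el.text.split()))
    return pts, faces

points, faces = read_tin(SAMPLE)
print(len(points), faces[0])   # → 3 (1, 2, 3)
```

Multiply that by millions of points and you can see why a LiDAR surface export balloons to gigabytes: every vertex and triangle is spelled out in plain text.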

Anyway, I exported the LiDAR surface only, which generated a 1.5 GB file. Bringing this into 3ds Max takes a long time, so make sure it's done as an overnight activity so as not to tie up resources. The final Max file was around 450 MB and runs relatively quickly.

In this scenario I used the Export to Civil View for 3ds Max tool. The other import tools in Max require the right settings from 12d or MX, but there is no need for custom formats or exports as with Civil 3D. For 12d, simply ask your civil designers for the main files; they should be saved in ASCII format already, or depending on how they built their models you might want them to combine it all into a single export. 12d handles large datasets far better than Civil 3D, without having to split files and create additional CAD management, so in many cases our largest projects are done in 12d.

If you need complex scene animations, 4D, road analysis or phasing, it's best to discuss with your civil engineer what you are trying to achieve; if they are agreeable, you can get them to quickly build a number of joined and split surfaces, both existing and during/post construction, to get the surfaces you need.

The end results really make a big difference on large civil projects with little time investment.

Sunday, January 5, 2014

Revit Server 2013 Model Corruption!!

Hey Guys

Happy New Year and I hope all enjoyed their holidays.

Here is a quick issue we have encountered and never knew about: if anyone deletes models from Revit Server 2013 while users on ANY OTHER project on the server are synchronising within a couple of minutes of the Host server replicating the deletion to the Accelerators, then the models those users are working on will become corrupt.

So make sure any cleanup of Revit Server 2013 is done out of hours, when people aren't working.

PS Revit Server 2014 does not seem to have this issue.