AI: Information Integrity

How to harness the power of LLMs without losing sight of critical thinking

As AI reshapes how we engage with information, Emma Hooper, head of information management strategy at RLB Digital, explores how we can refine large language models to improve accuracy, reduce bias, and uphold data integrity — without losing the essential human skill of critical thinking

In a world where AI is becoming an increasingly integral part of our everyday lives, the potential benefits are immense. However, as someone with a background in technology — having spent my career producing, managing or thinking about information — I continue to contemplate how AI will alter our relationship with information and how the integrity and quality of data will be managed.

Understanding LLMs

AI is a broad field focused on simulating human intelligence, enabling machines to learn from examples and apply this learning to new situations. As we delve deeper into its sub-types, we become more detached from the inner workings of these models, and the statistical patterns they use become increasingly complex. This is particularly relevant with large language models (LLMs), which generate new content based on training data and user instructions (prompts).

A large language model (LLM) uses a transformer model, a specific type of neural network. These models learn patterns and connections from words and phrases, so the more examples they are fed, the more accurate they become. Consequently, they require vast amounts of data and significant computational power, which puts considerable pressure on the environment. These models power tools such as ChatGPT, Gemini, and Claude.



The case of DeepSeek-R1

DeepSeek-R1, which has recently been in the news, demonstrates how constraints can drive innovation through good old-fashioned problem-solving. This open-source LLM uses rule-based reinforcement learning, making it cheaper and less compute-intensive to train compared to more established models.

Since it is an LLM, however, it still faces limitations in output quality. When it comes to accuracy, LLMs are statistical models that operate based on probabilities, so their responses are limited to what they’ve been trained on. They perform well when operating within their dataset, but if there are gaps, or if they go out of scope, inaccuracies or hallucinations can occur.
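To make that point concrete, the sketch below (Python, with made-up numbers rather than output from any real model) shows the core mechanic: a language model assigns scores to candidate next tokens, converts them into probabilities, and samples. Nothing is looked up or verified, which is why gaps in the training data can surface as confident-sounding errors.

```python
import numpy as np

# Illustrative scores ("logits") a model might assign to candidate next
# tokens after the prompt "The wall is made of ..." (values are made up).
tokens = ["brick", "concrete", "timber", "cheese"]
logits = np.array([3.1, 2.8, 2.2, -1.5])

# Softmax turns raw scores into a probability distribution.
probs = np.exp(logits) / np.sum(np.exp(logits))

# Generation samples from that distribution; nothing is looked up or
# verified, which is why gaps in the training data can surface as
# plausible-sounding but wrong outputs (hallucinations).
rng = np.random.default_rng(0)
next_token = rng.choice(tokens, p=probs)

for token, p in zip(tokens, probs):
    print(f"{token}: {p:.3f}")
print("sampled:", next_token)
```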

Inaccurate information is problematic when reliability is crucial, but trust in quality isn’t the only issue. General LLMs are trained on internet content, but much domain-specific knowledge isn’t captured online or is behind downloads/paywalls, so we’re missing out on a significant chunk of knowledge.

Training LLMs: the built environment

Training LLMs is resource-intensive and requires vast amounts of data. However, data sharing in the built environment is limited, and ownership is often debated. This raises several questions in my mind: Where does the training data come from? Do trainers have permission to use it? How can organisations ensure their models’ outputs are interoperable? Are SMEs disadvantaged due to limited data access? How can we reduce bias from proprietary terminology and data structures? Will the vast variation hinder the ability to spot correct patterns?

With my information manager hat on, I worry that without proper application and understanding it’s not just rubbish in, rubbish out; it’s artificially generated rubbish on a scale that completely overwhelms us.

How do we improve the use of LLMs?

There are techniques such as Retrieval Augmented Generation (RAG), which uses vector databases to retrieve relevant information from a specific knowledge base. This information is included in the LLM prompt to produce outputs that are much more relevant and up to date. Having more control over the knowledge base ensures the sources are known and reliable.
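A minimal sketch of the retrieval step follows (Python; a toy hash-based embedding and in-memory search stand in for a real embedding model and vector database, and the documents are invented for illustration):

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in for a real embedding model: hash words into a small vector."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# A tiny, trusted knowledge base. In practice this would be your own
# documents, chunked and indexed in a vector database.
docs = [
    "ISO 19650 sets out processes for managing information using BIM.",
    "A clash report lists hard and soft clashes between federated models.",
    "COBie is a structured data format for facilities management handover.",
]
doc_vectors = [embed(d) for d in docs]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q = embed(query)
    scores = [float(q @ v) for v in doc_vectors]
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

query = "Which format is used for FM handover data?"
context = retrieve(query)

# The retrieved passages are placed in the prompt, so the LLM answers from
# known, reliable sources rather than from its training data alone.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```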

This leads to an improvement, but the machine still doesn’t fully understand what it’s being asked. By introducing more context and meaning, we might achieve better outputs. This is where returning to information science and using knowledge graphs can help.

A knowledge graph is a collection of interlinked descriptions of things or concepts. It uses a graph-structured data model within a database to create connections – a web of facts. These graphs link many ideas into a cohesive whole, allowing computers to understand real-world relationships much more quickly. They are underpinned by ontologies, which provide a domain-focused framework to give formal meaning. This meaning, or semantics, is key. The ontology organises information by defining relationships and concepts to help with reasoning and inference.

Knowledge graphs enhance the RAG process by providing structured information with defined relationships, creating more context-enriched prompts. Organisations across various industries are exploring how to integrate knowledge graphs into their enterprise data strategies – so much so that they have even made it onto the Gartner Hype Cycle, on the Slope of Enlightenment.
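As a hedged illustration of how explicit relationships supply that context (Python, using the rdflib library; the entities and ontology terms are invented, not drawn from any real AEC ontology):

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/aec/")  # illustrative ontology namespace
g = Graph()

# A tiny web of facts: concepts linked by explicitly defined relationships.
g.add((EX.Wall_01, RDF.type, EX.LoadBearingWall))
g.add((EX.Wall_01, EX.partOf, EX.Core_A))
g.add((EX.Opening_17, EX.hostedBy, EX.Wall_01))
g.add((EX.Opening_17, EX.requiredFor, EX.Duct_09))

# Because relationships are explicit, we can query for structured context
# to enrich a RAG prompt, e.g. "what is affected if Wall_01 changes?"
results = g.query("""
    PREFIX ex: <http://example.org/aec/>
    SELECT ?opening ?service WHERE {
        ?opening ex:hostedBy ex:Wall_01 .
        ?opening ex:requiredFor ?service .
    }
""")
for opening, service in results:
    print(f"{opening} serves {service}")
```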

The need for critical thinking

From an industry perspective, semantics is not just where the magic lies for AI; it is also crucial for sorting out the information chaos in the industry. The tools discussed can improve LLMs, but the results still depend on a backbone of good information management. This includes having strategies in place to ensure information meets the needs of its original purpose and implementing strong assurance processes to provide governance.

Therefore, before we move too far ahead, I believe it’s crucial for the industry to return to the theory and roots of information science. By understanding this, we can lay strong foundations that all stakeholders can work from, providing a common starting point and a sound base to meet AI halfway and derive the most value from it.

Above all, it’s important not to lose sight of the fact that this begins and ends with people, and one of the greatest things we can ever do is think critically and keep questioning!

Future of AEC Software: Special Report

What the AEC industry wants from future design tools

Written by Aaron Perry, head of digital design at Allford Hall Monaghan Morris, and Andy Watts, director of design technology at Grimshaw


This must-read report details what the AEC industry wants from future design tools, covering everything from data frameworks, context and scale, responsible design, and modular construction, to user experience, modelling capabilities, automation, intelligence, deliverables and more.



Watch the NXT DEV presentations from Aaron Perry and Andy Watts

NXT DEV 2023 – watch the video on NXTAEC.com

Aaron Perry, talking on behalf of a collective of medium-to-large AEC firms, gives a masterful presentation as he introduces the ‘Future Design Software Specification’.


NXT DEV 2024 – watch the video on NXTAEC.com

Andy Watts gives an important update on the specification, then hands over to Allister Lewis, ADDD, to talk about benchmarking software against the specification.


BHoM – addressing the interoperability challenge

The BHoM computational development project allows AEC teams to improve project collaboration, foster standardisation and develop advanced computational workflows, as Buro Happold’s Giorgio Albieri and Christopher Short explain

In Buro Happold’s structural engineering team, we’re constantly working on unique and challenging projects, from towering skyscrapers to expansive stadiums, intricate museums to impressive bridges.

Our approach is all about exploring multiple options, conducting detailed analyses, and generating 3D and BIM models to bring these projects to life. But this process comes with the major challenge of interoperability – the ability of different systems to exchange information.

Since we collaborate with multiple disciplines and design teams from all over the world, we regularly deal with data from various sources and formats, which can be a real challenge to manage.

The AEC industry often deals with this by creating ad-hoc tools as and when the need arises (such as complex spreadsheets or macros). But these tools often end up being one-offs, used by only a small group, so we end up reinventing the wheel again and again.

This is where the BHoM (Buildings and Habitats object Model) comes into play: a powerful open-source, collaborative computational development project for the built environment, supported by Buro Happold.

BHoM helps improve collaboration, foster standardisation and develop advanced computational workflows. Thanks to its central common language, it makes it possible to interoperate between many different programs.

Instead of creating translators between every possible combination of software applications, we just need to write one single translator between BHoM and a target software package, which then connects to all the others.
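The pay-off is arithmetic: direct translation between n applications needs on the order of n² converters, whereas a hub-and-spoke model needs only n adapters. The sketch below illustrates the shape of that pattern in Python (all class and method names are invented for illustration; BHoM itself is a C#/.NET framework with its own adapter API):

```python
class Bar:
    """Software-agnostic description of a structural bar (the central schema)."""
    def __init__(self, start, end, section):
        self.start, self.end, self.section = start, end, section

class Adapter:
    """One translator per application, to and from the central schema."""
    def push(self, objects):
        raise NotImplementedError

class RobotAdapter(Adapter):
    def push(self, objects):
        for bar in objects:
            print(f"[Robot] create bar {bar.start}->{bar.end} ({bar.section})")

class RevitAdapter(Adapter):
    def push(self, objects):
        for bar in objects:
            print(f"[Revit] create framing element {bar.start}->{bar.end}")

# One central model, pushed through any adapter. Supporting an extra
# application means writing one new adapter, not one translator per pairing.
model = [Bar((0, 0, 0), (0, 0, 3.5), "UC305x305x97")]
for adapter in (RobotAdapter(), RevitAdapter()):
    adapter.push(model)
```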

One-to-one connection approach between software packages (top) vs direct connection to BHoM centralised software-agnostic environment (above) highlighting current collection of main BHoM adapters

The solution: The BHoM

The BHoM consists of a collection of schemas, functionalities and conversions with the following three main characteristics:

• It attempts to unify the “shape” of the data

• It is crafted as software-agnostic

• It is open source so that everyone can contribute and use it

Currently, the BHoM has over 1,200 object models with an extendable data dictionary and adapters to over 30 different software packages.

With the BHoM, we’ve refined and enhanced our approach to structural design.

Once the architectural model is received, using the BHoM we can quickly and precisely build several Finite Element Analysis (FEA) structural models for conducting structural analyses.

It’s possible to clean and rationalise the original geometries for specific purposes and assign/update attributes to all objects based on the results of both design and coordination with other disciplines.

Finally, the BIM model of the structure can be generated in an algorithmic manner.


Algorithm for the computation and documentation of the connection forces with textual and graphical outputs

BHoM in practice

It’s often thought that computational and parametric design is only applicable to the very early stages of a project, or to projects that rely on very complex geometry.

The reality is, computational design is greatly beneficial at every stage: from the conceptual feasibility study to the detailed design of steel connections.

At Buro Happold, we use the BHoM to help us address multiple stages throughout a project, as demonstrated in the following case study examples which focus on the re-development of a desalination plant in Saudi Arabia into a huge museum.



Modelling the existing and the new

Let’s see how a computational workflow applies to the modelling and analysis of existing structures, making use of the BHoM.

For the Saudi Arabian project, all we had was a set of scanned PDF drawings of the existing structures.

Within a couple of months, we had to build accurate Finite Element Models for each of them and run several feasibility studies against the new proposed loadings.

A parametric approach was vital. Therefore, we developed a computational workflow that allowed us to create the geometric models of all the built assets in Rhino via Grasshopper by tracing the PDF drawings, assigning them with metadata and pushing them via BHoM into Robot to carry out preliminary analyses and design checks.

It goes without saying how much time and effort this approach saved us compared to a more traditional workflow.

Moving on to the next stage of the project, we needed to test many different options for the proposed structures very quickly, by modifying grids, floor heights, and beam and column arrangements, as well as playing with the geometry of arched trusses and trussed mega-portals.

Again, going for a computational approach was the only way to face the challenge, and we developed a large-scale algorithm in Grasshopper.

By pulling data from a live database in Excel and making use of an in-house library of clusters and textual scripts, this algorithm was able to leverage the capabilities of the BHoM to model the building parametrically in Rhino, push it to Robot for the FEA and finally generate the BIM model in Revit – all in a single parametric workflow.

Managing data flow: BIM – FEA

As we move into later stages of the project, the more we can see how computational workflows are not only beneficial for geometry generation but also for data management and design calculations.

At Stages 3 and 4 we needed to be able to transfer and modify, very quickly, the huge sets of metadata assigned to every asset within our BIM models, while testing them from a design perspective in finite element software.

Again, we developed an algorithm in Grasshopper leveraging the BHoM to allow for this circular data flow from BIM to analysis software – Revit and ETABS in this instance.

This made it possible to test and update all our models quickly and precisely, notwithstanding the sheer amount of data involved.

Interdisciplinary coordination

As usual, when moving forward in the project, coordination with MEP engineers starts to ramp up and when structures are big and complex, it becomes even more difficult. The challenge we had to face was intimidating. We had eight concrete cores, 45m tall, more than 9,000 Mechanical, Electrical and Plumbing (MEP) assets for the building and around 1,500 builderswork openings to be provided in the core walls to allow them to pass through.


Graphical representation of the algorithm for the automated creation of builderswork openings in the concrete cores of the building

On top of this, we had the need to specify openings of different sizes depending on different requirements based on the type of MEP asset, as well as the need to group and cluster openings based on their relative distance and other design criteria.

Again, a high level of complexity and a huge amount of data to deal with. Indeed, a computational approach was needed.

Using Grasshopper, BHoM and Rhino.Inside.Revit, we developed an algorithm, graphically represented below.


Flowchart of typical BHoM-based computational workflow on projects

Through grouping operations, model laundry algorithms and the parametric modelling of the builderswork openings, we were able to generate the BIM model of the cores parametrically, complete with the required builderswork penetrations.

In parallel with this, the algorithm also generated the corresponding FE model of the core walls, so the structural feasibility of the penetrations could be checked before incorporating them in Revit.

The algorithm detected the intersections between pipes and walls, then generated openings of different sizes and colours around each intersection, depending on the input criteria. Then, using a fine-tuned grouping algorithm, it clustered and rationalised them into bigger openings, wrapping them together based on user-input criteria, as sketched below.
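A minimal sketch of that grouping step (Python; a single distance threshold stands in for the project’s real clustering and design criteria, and all coordinates are invented):

```python
from itertools import combinations

# Wall/pipe intersection points (x, z in metres on a wall face) and the
# opening size each requires. All values are invented for illustration.
openings = [
    {"pos": (1.0, 2.0), "size": 0.2},
    {"pos": (1.3, 2.1), "size": 0.3},
    {"pos": (6.5, 1.0), "size": 0.4},
]

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def cluster(items, threshold=0.6):
    """Greedy single-link clustering: merge groups closer than the threshold."""
    groups = [[o] for o in items]
    merged = True
    while merged:
        merged = False
        for g1, g2 in combinations(groups, 2):
            if any(dist(a["pos"], b["pos"]) < threshold for a in g1 for b in g2):
                g1.extend(g2)
                groups.remove(g2)
                merged = True
                break
    return groups

# Each cluster is wrapped into one rationalised opening (a bounding span).
for group in cluster(openings):
    xs = [o["pos"][0] for o in group]
    zs = [o["pos"][1] for o in group]
    print(f"opening spanning x={min(xs)}..{max(xs)}, z={min(zs)}..{max(zs)}")
```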

Finally, after testing the openings in the Finite Element software, the algorithm pushed them into Revit as Wall Hosted Families and a live connection between the Rhino and the Revit environment streamlined any update process in parallel.

Producing large data sets

Moving even further into detailed design, the amount of data to deal with on a project of such scale becomes more and more overwhelming.

This is what we had to face when dealing with the design of the connections. Although the design was subcontracted to another office, we faced the challenge of providing all the connection design forces in a consistent and comprehensive format, both in textual and graphical contexts.

Indeed, this is not an easy task, especially when dealing with around 35,000 connections, 60 load combinations, 2,000 different frame inclinations and six design forces per connection, spanning three different finite element software packages (ETABS, Robot and Oasys GSA).

That adds up to 12.6 million pieces of data, which we had to handle very quickly and be able to update on the fly. Again, a computational workflow was required.

Via Grasshopper and the BHoM, we developed an algorithm to extract, post-process and format the connection forces from the Finite Element models of all the assets of the project, serialise them in JSON, save them in properly formatted Excel files and show them graphically in corresponding Rhino 3D models via tagging and attributes assignment.
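A hedged sketch of that extract-and-serialise step follows (Python; the record structure, units and file name are assumptions, and a stub stands in for the BHoM queries that pull results from the FE models):

```python
import json

def pull_connection_forces():
    """Stub standing in for BHoM pulls from the ETABS, Robot and GSA models.

    Real workflow: one record per connection per load combination, with
    six design forces each; the field names here are assumptions.
    """
    return [
        {
            "connection_id": "CX-00017",
            "combination": "ULS-01",
            "forces_kN_kNm": {"N": 152.0, "Vy": 12.4, "Vz": 8.1,
                              "Mt": 0.6, "My": 22.3, "Mz": 5.9},
        },
        # ... roughly 12.6 million such records on the project described above
    ]

records = pull_connection_forces()

# Serialising to JSON gives one dataset that can feed both the formatted
# Excel outputs and the tagged Rhino views without re-querying the models.
with open("connection_forces.json", "w") as f:
    json.dump(records, f, indent=2)

print(f"wrote {len(records)} records")
```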

All this information was sent out for the design to be carried out by other parties.

Conclusions

Applying a specialised approach, relying on algorithmic methodology and leveraging state-of-the-art computational tools such as the BHoM enables us, at Buro Happold, to deliver comprehensive and advanced structural solutions, ensuring efficiency, sustainability and optimal performance across all stages of a project.


Dalux ‘Hygge’

The history of Common Data Environments (CDEs) has been long, with many twists and turns. CDEs were necessary because BIM tools produced huge files and created deep silos that inhibited collaboration. The CDE developers who survived have gone on to build broad platforms to liberate design and construction data far and wide. Dalux is one of those firms, as Martyn Day reports

Established in 2005, Dalux is a Danish software firm which has created a digital platform for almost everything outside of BIM authoring tools. It focusses on information management, design management, model validation, tendering, site inspections, and snagging, for construction firms, developers and consultants. Dalux’s software expands through the lifecycle to data handover and facilities management. The company is also scaling up into infrastructure and GIS.

Dalux started off creating what it claims to be the fastest BIM model viewing tool, being first to apply games technology to BIM geometry, an early entry into what is now the Common Data Environment (CDE) market. From that initial product Dalux has built a whole platform around its centralised data model, expanding to mobile and augmented reality.

Dalux now has a global user base of over 1 million professionals across 147 countries. Despite its scope and reach, the company is very much headquartered in Copenhagen, Denmark, which is the centre of operations and development.

The company has an annual user meeting, the Dalux Summit, which is hosted in Copenhagen. This year AEC Magazine attended to delve deeper into the products and the community. With over 1,800 attendees, the scale was much larger than we anticipated and the whole vibe was a unique experience.



Dalux feels like a family business and the dialogue and interactions between customers and the team gave the distinct impression that development of features and capabilities was a much more interactive process than at other software companies. Dalux has ‘Hygge’, a Danish word that roughly translates to ‘cosiness.’

The family business is run by two brothers, Torben and Bent Dalgaard. Torben is the CEO and Bent is the CTO. In their morning address to attendees, one slide caught the zeitgeist perfectly – the brothers reassured the audience that Dalux is an independent software firm that has no loans, no investors and is owned by Bent & Torben.

While many 19-year-old firms that have been growing 60% annually for almost a decade would be wafting share value, revenue or profit as an essential part of their mainstage moment, they opted to reassure customers that, unlike firms with shareholders that are repeat plunderers of their customers’ design technology budgets, Dalux is not in that game. There are very few AEC software companies with this attitude that come to mind – the most notable others being McNeel (Rhino) and Qonic.

Dalux has built a whole platform around its centralised data model, expanding to mobile and augmented reality

The Dalux product family

At the moment, Dalux offers nine products, which it has grouped in information management, onsite management and facility management, with almost half of its brands in on-site management.

BIM Viewer is free and works on desktop and mobile. It supports native BIM, IFC and drawings, with a range of free plug-ins for Revit, Solibri, Archicad, Navisworks and Tekla. It offers a suite of tools including measure, filter, properties, and make sections. Comments can be added, clashes from Solibri and Navisworks can be imported. We suspect that this is the gateway drug to the Dalux ecosystem!

Box is the core collaboration and CDE platform that delivers BIM geometry and data to collaborating project teams. It is accessed via a web browser or supported mobile devices (iOS or Android).



Having extracted the data from the BIM authoring tool, Box centralises all the project information in managed folders for design and construction teams to view, review and approve 2D and 3D data, with individual team controls. Additionally, Box offers the ‘always requested’ clash detection, for both hard and soft clashes, and can perform other geometry checks, such as comparisons against as-built point clouds.

Dalux Box Sync synchronises folders and files between the web and a local computer. It will also upload any files you add or change, making them available to other project participants.

Field is the product for quality control, health and safety, snagging/punch lists and on-site reality capture. It brings the latest drawings to site and assists in scheduling and managing site inspections with customisable checklists. Onsite observations, snags and health and safety reports can be quickly created and documented with the phone’s camera, and the system is smart enough to know where on site you are located, logging the time and floor. There are workflow tools to trigger actions for those who need to resolve remedial work. The reports are accessible to project workers, and issues are clearly identified on the latest drawings.

Field Basic is a free punch list tool that supports drawings and BIM models and enables tasks, collaboration amongst defined groups, and sign-off.




Field Sitewalk enables the quick capture of a site using video from a helmet-mounted 360-degree camera. The video frames make photographing the site effortless, and these are automatically mapped in the Dalux system. Teams back in the office can use the system to see the current state of construction and even compare the site against the BIM model to see if the work is on track. The system offers some very clever registration between the rooms captured and the same views generated from the BIM geometry.


Mapping the walks in Field Sitewalk, which enables the quick capture of a site using video from a helmet-mounted 360 degree camera

Infrafield is Dalux spreading its wings into the world of projects that span tens of kilometres, rather than the tens of metres of individual buildings. Given Dalux’s client list, we can well understand how infrastructure became inevitable.

Infrafield required a new modelling engine technology to provide the expansive coordinate system. It supports 2D and 3D, Google Maps 3D tiles, drawings, GIS layers, terrain layers, and point clouds. Like Field, it can be used to track progress and capture issues. Users can create sections, cuts and measurements. It is seamlessly integrated into the Dalux ecosystem, so infrastructure models can be shared.


Dalux Infrafield

FM – facilities management – is probably another no-brainer, following the design and construction data into operations. It’s quite refreshing to not have to deal with the branding bludgeon that is ‘digital twin’. FM is a web and mobile content management system for 2D and 3D asset management, operations and maintenance. It combines floorplans, mapping and modelling based on location, aiding navigation. It offers a helpdesk ticketing system, work order generation and maintenance schedules, and is a conduit for additional digitised documents, asset information, photos and so on.

Again, the smart application uses GPS to position the user in floorplans and can be used in conjunction with QR codes for asset tagging or room tagging. The system comes with workflow tools to route tickets to the right department or person.

Handover is the Dalux product for packaging up and handing over design, construction and associated project information post build. Using templates, Handover can save a lot of time making sure the right information is used for FM downstream. It can output PDF reports and COBie files.

Tender is the secure app for distributing tenders on projects through Dalux and integrates to Dalux Field. Tender bids come with ready packaged up documents in a logical folder structure. The project owner remains in control and Dalux provides a full audit trail of any changes.

Conclusion

While US giants Procore and Autodesk Construction Cloud look to dominate the flow of data among construction and subcontractor firms, Dalux appears to be a European equivalent that is holding its own. However, the Dalgaard brothers have managed to keep the firm accessible to its customers and build a unique relationship.

As I understand it, firms pay fees based on project size, as opposed to by number of users, meaning Dalux becomes the single source of truth for the construction data for all participants.

Dalux appears very support-centric, and it prioritises ongoing connection with developers and product champions in its customer base. It’s another reason why 1,800 people would visit Copenhagen to meet up with what felt more like a bespoke outsourced software developer than a firm trying to meet next quarter’s targets.


Main image: In their morning address to attendees the Dalgaard brothers caught the zeitgeist perfectly

Autodesk charts its AI future

If 2023 was the year that Autodesk announced its ambitions for AI, 2024 was when it fleshed out some of the details. But, as Greg Corke reports, there’s still a long journey ahead

The Autodesk AI brand debuted in Las Vegas last year at Autodesk University, but the launch lacked any real substance. Despite a flashy logo there were no significant new AI capabilities to back it up. The event seemed more like a signal of Autodesk’s intent to add greater focus on AI in the future — building on its past achievements. It came at a time when ‘AI-anything’ was increasing the share valuations of listed companies.

Fast forward 12 months and at Autodesk University 2024 in San Diego the company delivered more clarity on its evolving AI strategy — on stage and behind the scenes in press briefings. Autodesk also introduced a sprinkling of new AI features with many focused on modelling productivity, signalling that progress is being made. However, most of these were for manufacturing with little to excite customers in Architecture, Engineering and Construction (AEC), other than what had already been announced for Forma.

In his keynote, CEO Andrew Anagnost took a cautious tone, warning that it’s still early days for AI despite the growing hype from the broader tech industry.

Anagnost set the scene for the future. “We’re looking at how you work. We’re finding the bottlenecks. We’re getting the right data flowing to the right places, so that you can see past the hype to where there’s hope, so that you can see productivity rather than promises, so that you can see AI that solves the practical, the simple, and dare I even say, the boring things that get in your way and hold back you and your team’s productivity.”

One of those ‘boring things’ is sketch constraints, which govern a sketch’s shape and geometric properties in parametric 3D CAD software like Autodesk Fusion, which is used for product design and manufacturing.

Fusion’s new AI-powered sketch auto-constrain feature streamlines this process by analysing sketches to detect intended spatial relationships between aspects of the design.

Automatically constraining sketches is just the starting point in Autodesk’s broader vision to use AI to optimise and automate 3D modelling workflows. As Anagnost indicated, the company is exploring how AI models can be taught to understand deeper elements of 3D models, including features, constraints, and joints.

At AU, no reference was made to similar modelling productivity tools being developed for Autodesk’s AEC products, including Forma. However, Amy Bunszel, executive VP, AEC at Autodesk, told AEC Magazine that the AEC team will learn from what happens in Fusion.

Another ‘boring’ task ripe for automation is the production of drawings. This labour-intensive process is currently a hot topic across the CAD sector (read this AEC Magazine article).

This capability is also coming first to Autodesk’s product design and manufacturing product. With Drawing Automation for Fusion, Autodesk is using AI to automate the process, down to the precise placement of annotations.

With the click of a button, the AI examines 3D models and does the time-consuming work of generating the 2D drawings and dimensions required to manufacture parts. The technology has evolved since its initial release earlier this year and now accelerates and streamlines this process even more by laying out drawing sheets for each component in a model and applying a style. Early next year, the technology will be able to recognise standard components like fasteners, remove them from drawing sets, and automatically add them to the bill of materials for purchase.

Once again, this feature will first appear in Fusion, but sources have confirmed plans to extend automated drawing capabilities to Revit—a significant development given the BIM tool’s widespread use for documentation. There’s also potential for autonomous drawings in Forma, although Forma would first need the ability to generate drawings. During the AU press conference, CEO Andrew Anagnost hinted that drawing capabilities might be in Forma’s future, which, if realised, could potentially impact how much customers rely on Revit as a documentation tool in the long term.


AutoConstrain in Fusion Automated Sketching helps maintain a designer’s intent throughout project iterations by detecting and suggesting dimensional constraints between aspects of a design
Drawing Automation automates the time-consuming process of creating 2D drawings from 3D models. Here seen in Fusion but there are also plans for Revit

Both of Autodesk’s new AI-powered features are designed to automate complex, repetitive, and error-prone processes, significantly reducing the time that skilled designers spend on manual tasks. This allows them to focus on more critical, high-value activities. But, as Anagnost explained, Autodesk is also exploring how AI can be used to fundamentally change the way people work.

One approach is to enhance the creative process. Form Explorer is a new automotive-focused generative AI tool for Autodesk Alias, designed to bridge the gap between 2D ideation and traditional 3D design. It learns from a company’s historical 3D designs, then applies that unique styling language.

Lessons learned from Form Explorer are also helping Autodesk augment and accelerate creativity in other areas of conceptual design.

Project Bernini is an experimental proof-of-concept research project that uses generative AI to quickly generate 3D models from a variety of inputs including a single 2D image, multiple images showing different views of an object, point clouds, voxels, and text. The generated models are designed to be ‘functionally correct’, so a pitcher, for example, will be empty inside. As the emphasis is on the geometry, Bernini does not apply colours and textures to the model.

Project Bernini is not designed to replace manual 3D modelling. “Bernini is the thing that helps you get to that first stage really quickly,” said Mike Haley, senior VP of research at Autodesk. “Nobody likes the blank canvas.”

Project Bernini is industry agnostic and is being used to explore practical applications for manufacturing, AEC and media and entertainment. At AU the emphasis was on manufacturing, however, where one of the ultimate aims is to learn how to produce precise geometry that can be converted into editable geometry in Fusion.

However, there’s a long way to go before this is a practical reality. There is currently no established workflow, plus Bernini has been trained on a limited set of licensed public data that cannot be used commercially.


Project Bernini is designed to generate models that are ‘functionally correct’, so a pitcher, for example, will be empty inside

AI for AEC

Autodesk is also working on several AI technologies specific to AEC. Nicolas Mangon, VP, AEC industry strategy at Autodesk, gave a brief glimpse of an outcome-based BIM research project which he described as Project Bernini for AECO.

He showed how AI could be used to help design buildings made from panellised wood systems, by training it on past BIM projects stored in Autodesk Docs. “[It] will leverage knowledge graphs to build a dataset of patterns of relationship between compatible building components, which we then use to develop an auto complete system that predicts new component configurations based on what it learned from past projects,” he said.

Mangon showed how the system suggests multiple options to complete the model driven by outcomes such as construction costs, fabrication time and carbon footprint. This, he said, ensures that when the system proposes the best options, the results are not only constructible, but also align with sustainability, time and cost targets.

Another AEC focused AI tool, currently in beta, is Embodied Carbon Analysis in Autodesk Forma, which is designed to give rapid site-specific environmental design insights. “It lets you quickly see the embodied carbon impact at the earliest conceptual design phase, giving you the power to make changes when the cost is low,” said Bunszel.

The software uses EHDD’s C.Scale API, which applies machine learning models based on real data from thousands of buildings. The technology helps designers balance trade-offs between embodied carbon, sun access, sellable area and outdoor comfort.

Embodied Carbon Analysis in Autodesk Forma follows other AI-powered features within the software. With ‘Rapid Noise Analysis’ and ‘Rapid Wind Analysis’, for example, Forma uses machine learning to predict ground noise and wind conditions in real time.

Autodesk AI is also providing insights in hydraulic modelling through Autodesk InfoDrainage, as Bunszel explained, “You can place a pond or swale on your site and quickly see the impact on overland flows and the surrounding flood map.”


Embodied Carbon Analysis in Autodesk Forma, which is designed to give rapid site-specific environmental design insights

Simple AI

Autodesk is also diving into the world of general purpose AI through the application of Large Language Models (LLMs). With Autodesk Assistant, customers can use natural language to ask questions about products and workflows.

Autodesk Assistant has been available on Autodesk’s website for some time and is now being rolled out gradually inside Autodesk products.

“The important thing about the system, is it’s going to be context-aware, so it’s understanding what you’re working on, what project you’re on, what data you’ve run, maybe what you’ve done before, where you are within your project, that kind of thing,” said Haley.

With the beta release of Autodesk Assistant in Autodesk Construction Cloud, for example, users can explore their specification documents through natural language queries, as Bunszel explained, “You can ask the assistant using normal everyday language to answer questions, generate lists or create project updates,” she said, adding that it gives you access to intuitive details from your specifications that usually require lots of clicking or page turning or highlighting to find.


Autodesk Assistant in Autodesk Construction Cloud

Getting connected

Like most software developers, Autodesk is harnessing the power of LLMs and vision models, such as ChatGPT and Gemini. “We can use them, we can adapt them, we can fine tune them to our customers’ data and workflows,” said Haley, citing the example of Autodesk Assistant.

But, as Haley explained, language and vision models don’t have any sense of the physical world, so Autodesk is focusing much of its research on developing a family of foundation models that will eventually deliver CAD geometry with ‘high accuracy and precision’.

Autodesk’s foundation models are being trained to understand geometry, shape, form, materials, as well as how things are coupled together and how things are assembled.

“Then you also get into the physical reasoning,” added Haley. “How does something behave? How does it move? What’s the mechanics of it? How does a fluid flow over the surface? What’s the electromechanical properties of something?”

According to Anagnost, the ultimate goal for Autodesk is to get all these foundation models talking together, but until this happens, you can’t change the paradigm.

“Bernini will understand the sketch to create the initial geometry, but another model might understand how to turn that geometry into a 3D model that actually can be evolved and changed in the future,” he said. “One might bring modelling intelligence to the table, one might bring shape intelligence to the table, and one might be sketch driven, the other one might be sketch aware.”

To provide some context for AEC, Autodesk CTO Raji Arasu said, “In the future, these models can even help you generate multiple levels of detail of a building.”

AI model training

Model training is a fundamental part of AI, and Anagnost made the point that data must be separated from methods: “You have to teach the computer to speak a certain language,” he said. “We’re creating training methods that understand 3D geometry in a deep way. Those training methods are data independent.”

With Project Bernini, Autodesk is licensing public data to essentially create a prototype for the future. “We use the licensed data to show people what’s possible,” said Anagnost.

For Bernini, Autodesk claims to have used the largest set of 3D training data ever assembled, comprising 10 million examples, but the generated forms that were demonstrated — a vase, a chair, a spoon, a shoe, and a pair of glasses — were still primitive. As Tonya Custis, senior director of AI research, admitted, there simply isn’t enough 3D data anywhere to build the scale of model required, highlighting that the really good large language and image models are trained on the entire internet.

“It’s very hard to get data at scale that very explicitly ties inputs to outputs,” she said. “If you have a billion cat pictures on the internet that’s pretty easy to get that data.”

The billion-dollar question is where will Autodesk get its training data from? At AU last year, several customers expressed concern about how their data might be used by Autodesk for AI training.

This was a hot topic again this year and in the press conference Anagnost provided more clarity. He told journalists that for a generative AI technology like Bernini, where there’s a real possibility it could impact on intellectual property, customers will need to opt in.

But that’s not the case for so-called ‘classic productivity’ AI features like sketch auto-constrain or automated drawings, “No one has intellectual property on how to constrain a sketch,” said Anagnost. “[For] that we just train away.”

This point was echoed by Hooper in relation to automated drawings, “Leveraging information that we have in Fusion about how people actually annotate drawings is not leveraging people’s core IP,” he said.

To help bring more transparency to how Autodesk is using customer data for training its AI models, Autodesk has created a series of Autodesk AI transparency cards which will be made available for each new AI feature. “These labels will provide you a clear overview of how each AI feature is built, the data that is being used, and the benefits that the feature offers,” said Arasu.


Of course, some firms will not want to share their data under any circumstances. Anagnost believes that this may lead to a bifurcated business model with customers, where Autodesk builds some foundational intelligence into its models and then licenses them to individual customers so they can be fine-tuned with private data.

AI compute

AI requires substantial processing power to function efficiently, particularly when it comes to training. With Autodesk AI, everything is currently being done in the cloud. This can be expensive but, as Anagnost boasted: “We have negotiating power with AWS that no customer would have with AWS.”

Relying on the cloud means that in order to use features in Fusion like auto constraints or drawing automation, you must be connected to the Internet.

This might not be the case forever, however. Arasu told AEC Magazine that AI inferencing [the process of using a trained AI model to make predictions or decisions based on new data] could go local. She noted that some of Autodesk’s customers have powerful workstations on their desktops, implying that using the cloud for compute would leave those resources wasted.

All about the data

It goes without saying that data is a critical component of Autodesk’s AI strategy, particularly when it comes to what Autodesk calls outcome-based BIM, as Mangon explained, “Making your data from our existing products available to the Forma Industry Cloud will create a rich data model that powers an AI-driven approach centred on project outcomes, so you can optimise decisions about sustainability, cost, construction time and even asset performance at the forefront of the project.”

To fully participate in Autodesk’s AI future, customers will need to get their data into the cloud-based common data environment, Autodesk Docs, which some customers are reluctant to do, for fear of being locked in with limited data access only through APIs.

Autodesk Docs can be used to manage data from AutoCAD, Revit, Tandem, Civil 3D, Autodesk Workshop XR, with upcoming support for Forma. It also integrates with third-party applications including Rhino, Grasshopper, Microsoft Power BI and soon Tekla Structures.

The starting point for all of this is files but, over time, with the Autodesk AEC Data Model API, some of this data will become granular. The AEC Data Model API enables the break-up of monolithic files, such as Revit RVT and AutoCAD DWG, into ‘granular object data’ that can be managed at a sub-file level.

“With the AEC Data Model API, you can glimpse into the future where data is not just an output, but a resource,” said Sasha Crotty, Sr. Director, AEC Data, Autodesk. “We are taking the information embedded in your Revit models and making it more accessible, empowering you to extract precisely the data you need without having to dive back into the model each time you need it.”

Crotty gave the example of US firm Avixi, which is using the API to extract Revit data and gain valuable insights through Power BI dashboards.

When the AEC Data Model API launched in June, it allowed the querying of key element properties from Revit RVT files. Autodesk is now starting to granularize the geometry, and at AU it announced it was making Revit geometric data available in a new private beta. For more on the AEC Data Model API read this AEC Magazine article.
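The AEC Data Model API is GraphQL-based. The sketch below (Python) shows the general shape of such a query; the endpoint URL, query and field names are illustrative assumptions rather than extracts from Autodesk’s published schema, so check the official documentation before relying on any of them.

```python
import requests

# Endpoint, query shape and field names below are illustrative assumptions,
# not extracts from Autodesk's published schema; consult the official
# AEC Data Model API documentation before using any of them.
ENDPOINT = "https://developer.api.autodesk.com/aec/graphql"
TOKEN = "<OAuth access token>"

query = """
query WallElements($projectId: ID!) {
  elementsByProject(projectId: $projectId, filter: {category: "Walls"}) {
    results {
      id
      name
      properties { name value }
    }
  }
}
"""

response = requests.post(
    ENDPOINT,
    json={"query": query, "variables": {"projectId": "<project id>"}},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
for element in response.json()["data"]["elementsByProject"]["results"]:
    print(element["name"], element["properties"])
```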

Autodesk Docs is also being used to feed data into Forma Board, a digital whiteboard and collaboration tool that allows project stakeholders to present and discuss concepts.

“Forma Board lets you pull in visuals from Forma and other Autodesk products through Docs, and now you can demonstrate the impact of sun or noise, ask for feedback on specific concepts, and much more,” said Bunszel.


Forma Board is a digital whiteboard and collaboration tool that allows project stakeholders to present and discuss concepts

Revit also got some airtime, but the news was a little underwhelming. Bunszel shared her favourite Revit 2025 update – the ability to export to PDF in the background without stopping your work. Meanwhile, manufacturing customers were being shown the future, with new features coming to Inventor 2026 such as associative assembly mirror and part simplification.

In the press conference Anagnost reiterated how Forma is different to Revit. “It is driven by outcomes,” he said. “We’re not trying to redo Revit in the cloud.”

Anagnost added that Forma is going to start moving downstream into things that Revit ‘classically does well’. “It doesn’t mean it has to swallow all of Revit, and you know that would take a long time, but it can certainly do things that Revit does today as it expands,” he said.


An iterative future

Autodesk is beginning to add clarity to its AI strategy. It is addressing AI from two angles: bottom up, bringing automation to repetitive and error prone tasks, and top down with technologies like Project Bernini that in the future could fundamentally change the way designers and engineers work. The two will eventually meet in the middle.

Autodesk is keen to use AI to deliver practical solutions and the automation of drawings and constraints in Fusion should deliver real value to many firms right now, freeing up skilled engineers at a time when they are in short supply.

We expect automated drawings will find their way into Autodesk AEC products soon, but it’s hard to tell if Autodesk has any concrete plans to use AI for modelling productivity.

As to pushing data into Autodesk Docs to get the maximum benefit out of AI, the fear that some customers have of getting trapped in the cloud is unlikely to go away any time soon.

Meanwhile, it’s clear there’s still a long way to go before the AI foundation models being explored in Project Bernini can deliver CAD geometry with ‘high accuracy and precision’.

While Bernini is starting to understand how to create basic geometry, the 3D models need more detail, and Autodesk must also work out how they can be of practical use inside CAD. With rapid advances in text-to-image AI, one also wonders what additional value text-to-CAD might bring to concept design. One could also ask whether product designers, architects or engineers would even want to use something like this to kickstart their design process. As the technology is still so embryonic it’s very hard to tell. It’s also important to remember that Bernini is a proof-of-concept, designed to explore what’s possible, rather than a practical application.

Meanwhile, as Autodesk continues to develop the complex AI training methods, there’s also the challenge of sourcing data for training. It will be interesting to see how Autodesk’s trust relationship with customers plays out.

While Autodesk’s long-term plan is to get multiple foundation models to talk together, this doesn’t mean we are heading for true design automation any time soon.

At AU Anagnost admitted that the day where AI can automatically deliver final outcomes from an initial specification is further away than one might think. “For those of you who are trying to produce an epic work of literature with ChatGPT, you know you have to do it iteratively,” he said. That same iterative process will apply to AI for design for some time to come.

Autodesk Content Catalog

Eighteen months on from its acquisition by Autodesk, Unifi’s cloud-based software solution for managing and accessing design content has been reworked and integrated into Autodesk’s cloud stack

For mature BIM customers, having content at the tips of their fingers can lend productivity a major boost. It might be content created for previous projects that they need, or new content created for ongoing projects that needs to be shared among teams. Managing this kind of content can be complicated, however, and there have been numerous technology-based attempts to tackle the issue.

The big issue here is that the Internet was, and still is, a Wild West when it comes to downloadable content. Customers looking for BIM component data that doesn’t already exist in their own internal, managed repositories are forced to deal with issues around file size and quality and then incorporate these ‘foreign objects’ into well-managed BIM processes.

It’s been a challenge for the software industry. Take, for example, Autodesk Seek, a content website from the software giant that demonstrated exactly how disparate the quality of downloadable content can be. In early 2017, Autodesk ended up handing over the operations and customer support obligations relating to Autodesk Seek to BIMobject, a Sweden-based company that has taken on the gnarly task of encouraging AEC manufacturers to provide managed content and developed a high-end database to store it, at first for its own use and then later for fee-paying BIM customers. In the UK, meanwhile, we have BIMstore, among many others.

In my view, getting companies in the AEC industry to provide up-to-date, high-quality, modern digital deliverables that represent their entire product ranges is probably never going to happen. The task is too huge, and I think we may have to wait for AI to take it on.

Welcome news

That said, the recent announcement that Autodesk has reworked the technology acquired in its March 2023 acquisition of Unifi and integrated it into its own cloud stack is welcome news.

Unifi was founded in Las Vegas in 2010 by engineers Dwayne Miller and Ken Gardener, in response to the huge expansion of building programmes in that city over the past two decades, as well as in the Middle East and Asia. The goal was to give back to BIM users all the time they wasted searching for and downloading content.

In essence, Unifi was built as a library for collating and managing company and project content, providing control of virtual assets for firms looking to deploy consistent standards across project teams. Offering cloud accessibility, in-Revit access, intelligent search and browse and a stack of other management tools, Unifi gained real momentum quickly and inevitably popped up early on Autodesk’s radar. Getting into buying mode and negotiating a deal took time, but it was clear that Unifi was a product from which all Autodesk customers could benefit and which could easily be included in their subscription fees.



Since the deal was signed, Unifi has been reworked to fit into Autodesk’s cloud stack and rebranded as Content Catalog, a part of the Autodesk Docs subscription at no extra cost and manageable via the Autodesk Construction Cloud Admin. This means that users of numerous Autodesk products have free access to Content Catalog (including ACC, AEC Collection, BIM Collaborate, Collaborate Pro, Autodesk Build, Autodesk Takeoff), as well as those with an Autodesk Docs stand-alone subscription.

Meanwhile, customers using the most recent release of Unifi Pro are secure, and Autodesk has no plans to retire this product. In fact, Content Catalog doesn’t offer the full functionality of Unifi Pro, with a number of key features omitted. These include content ratings, content requests, personal saved searches and support for Revit legends, Revit Material and Fill Pattern. Also missing are the preview image generator, automatic user management for group syncing with SSO or Active Directory, the ability to create new shared libraries, manufacturer-provided content (channels), shared parameter management, APIs and Project Analytics.

It’s expected that many of these capabilities will be added over time. Project Analytics, for example, appears to be something that Autodesk is working on in a more general capacity within its cloud stack and the company plans a release of new management tools, with a separate licensing framework.

In conclusion, Unifi has the potential to be a big crowd-pleaser, especially with customers that have not yet implemented a content management system of their own. In many ways, content management within Autodesk’s BIM products is long overdue, as are industrial-strength management-level tools. The inclusion of Content Catalog in the Autodesk stack, and the possibility that the company is working on additional management reporting tools, is likely to be well-received.


Autodesk Content Catalog

Quantum Group selects Zutec for construction management

Single platform will help Ireland-based property developer digitise more construction and quality processes

Quantum Group, an Ireland-based property developer, has selected Zutec to manage its construction project data from a single platform.

By digitising building information and construction documents, Quantum will use Zutec’s document management system for planning, design, tenders, procurement, and plot tracking, including the ability to approve drawings for future developments and resolve issues on site as they arise.

“In a market where quality cannot be overlooked, we required a platform to differentiate ourselves from others and create a framework for quality-driven processes,” said Patrick Shaughnessy, construction director at Quantum Group.

“Zutec fitted the bill in terms of an easy-to-use platform that provides solutions, features and functionality that gives us more control over how we manage documents and information related to construction and quality for all our projects – all from one place.

“This will help us better manage site teams, site progress, suppliers, and sub-contractors, and ultimately raise the standards of the quality and innovation across our developments.”

Zutec will also deliver a suite of integrated quality assurance and health & safety forms, checklists and inspections to Quantum, helping to ensure data is digitised and collected in a consistent way, and that housing and apartments are built to the company’s high standards.

This will include the ability to upload photos to evidence work done, plus a snagging register so data can be reviewed and problems resolved ahead of handover. Data related to safety inspections can be captured on site in real time, then easily shared with teams to provide safety visibility and mitigate risk.

Zutec’s handover management will also help Quantum effectively manage projects to completion by bringing together O&M (operations & maintenance) manuals, fire evacuation files (FEF) and health and safety files (H&S) in one place to meet asset owner and regulatory obligations.

All data can be captured in the field by site teams and subcontractors using the Zutec Field app from any device, with or without a Wi-Fi connection. Information is then synced to the Zutec cloud when a device is online and uploaded into the Zutec dashboard for reporting and analytics, as well as easy access to information, site progress and compliance records.

“As developers and housebuilders look to digitise more construction and quality processes, our aim is to support them with the best solutions that help drive structure and standardisation in data and documentation across their business,” said James Cannon, Chief Revenue Officer at Zutec.

“Before now, Quantum didn’t have a digital system in place to manage information during the construction stages and relied on manual and paper-based processes. With Zutec they can have all their information in the cloud and workflows in place to ensure the right people have the right information at the right time, empowering teams to deliver builds more efficiently and to higher-quality standards, while giving site managers complete control and confidence over works completed. One solution for everything.”

Autodesk’s granular data strategy https://aecmag.com/data-management/autodesks-granular-data-strategy/ https://aecmag.com/data-management/autodesks-granular-data-strategy/#disqus_thread Thu, 25 Jul 2024 06:00:00 +0000 https://aecmag.com/?p=21061 The AEC world is still file based, but the future is databases in the cloud. We look at how Autodesk is addressing this shift

Autodesk’s new AEC data model API marks the beginning of the transition from monolithic files, such as Revit RVT, to granular object data to open up new opportunities for sharing information and greater insight into projects. Martyn Day explores what this might mean for AEC firms moving forward

In AEC software, the last forty years have been about buying branded tools and creating associated proprietary files. Compatibility has been achieved by basically buying the same software as your supply chain and sharing the same proprietary files.

The next forty years will be all about the data: where it’s stored and how teams can access it. Monolithic desktop solutions will be replaced with discrete cloud-based services. These will provide different ‘snapshots’ of project data (different disciplines, project activities, metadata), which will be accessed through seamless Application Programming Interfaces (APIs).

The advantages: no more sending data around in big lumps, caching huge files, or cutting models up. With data centrally sourced, collaboration can be built in from the ground up, and granular data opens new data-sharing opportunities, with greater insight into projects.

The key issue for the main software industry players, and their customers, is how do they get there? For companies like Autodesk, this is a significant challenge. It’s the market leader by volume, so has a lot to lose if it gets it wrong.

The company is currently developing Forma as its next generation cloud-based platform for AEC. Today, it may look like just another conceptual tool but there is a lot of engineering work taking place ‘under the hood’. In June, the day before AEC Magazine’s NXT BLD event, Autodesk announced the first downpayment on opening up the RVT file to new levels of granularity with the general availability of its AEC data model API. (cont…)


Find this article plus many more in the July / August 2024 Edition of AEC Magazine
👉 Subscribe FREE here 👈

The API is still in development and, for now, it only accesses metadata, without geometry, but this can be used to build dashboards or access design information that can be tabulated. Geometry will be the next layer of capability added to the API. At the time of release, Autodesk stated: “Through the AEC data model, we look to deliver a platform that prioritises a transparent and common language of describing AEC data, enabling real time access to this data and ensuring that the right data is available to the right people at the right time.”

 

How Autodesk introduces the AEC data model and API. Slides taken from Sasha Crotty & Virginia Senf’s ‘A new future for AEC data’ presentation at NXT BLD 2024.


Over time, Autodesk will continue to build this capability out, allowing developers to read, write, and extend subsets of models through cloud-based workflows via a single interface. There will be no need to write custom plug-ins for individual desktop authoring applications such as Civil 3D, Revit, Plant 3D and other connected AEC design applications. The file-based products will store their designs as files on Autodesk Docs, which will be ‘granularised’ on demand to meet customers’ requirements. However, in order to make AEC data more accessible and interoperable, everything must be restructured (converted) so that the data can be remapped and connected across AEC disciplines. The company expects the key deliverable of the AEC data model API to be enhanced support for iterative and distributed workflows.

AEC data model API capabilities

As mentioned earlier, this is just an initial instalment of granular capability. The API allows the querying of key element properties from ‘published’ models from Revit 2024 and above – published meaning that the files are stored on Autodesk Docs.

The AEC data model API exposes these properties through a simple-to-use GraphQL interface, tailored to the AEC industry. GraphQL is an open-source data query and manipulation language for APIs, with a query runtime engine.

Using GraphQL, users can access Autodesk Construction Cloud (ACC) accounts, projects and designs, and retrieve element and parameter values. It’s possible to retrieve different versions of a design and query for elements at a specific design version. Users can search for elements within a design, or across designs within a project or hub, using specified search criteria. It’s also possible to list all property definitions and query elements based on their properties, such as category (doors, windows, pipes, etc.), parameter name and value (area, volume, etc.), or material.

Autodesk expects customers and developers to automate workflows, such as identifying anomalies within designs (quality checking), locating missing information and comparing differences between designs. It’s possible to generate quantity take-offs and schedules, build dashboards and generate reports.
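To make this concrete, here is a minimal sketch of what such a query might look like from Python. The endpoint, schema field names and filter syntax below are illustrative assumptions, not the documented API – check the Autodesk Platform Services documentation for the real schema – but the shape of the workflow (authenticate, post a GraphQL query, tabulate the returned properties) holds.

```python
# Sketch only: endpoint, query fields and filter names are hypothetical.
import requests

APS_TOKEN = "<your-aps-access-token>"  # obtained via Autodesk's OAuth flow
ENDPOINT = "https://developer.api.autodesk.com/aec/graphql"  # assumed endpoint

# Hypothetical query: fetch door elements in a project and read their
# parameter values. Field and filter names are illustrative.
QUERY = """
query DoorsInProject($projectId: ID!) {
  elementsByProject(projectId: $projectId, filter: { category: "Doors" }) {
    results {
      name
      properties {
        name
        value
      }
    }
  }
}
"""

def fetch_doors(project_id: str) -> list:
    """POST the GraphQL query and return the matching elements."""
    response = requests.post(
        ENDPOINT,
        json={"query": QUERY, "variables": {"projectId": project_id}},
        headers={"Authorization": f"Bearer {APS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["data"]["elementsByProject"]["results"]

if __name__ == "__main__":
    # Tabulate a simple door schedule from the returned properties
    for element in fetch_doors("<your-project-id>"):
        props = {p["name"]: p["value"] for p in element["properties"]}
        print(element["name"], props.get("Area"), props.get("Volume"))
```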

For now, data granulation and viewing the results are free, but there are rate limits based on a points system. Overall, users are allowed 6,000 ‘points’ per minute, and each individual request is limited to 1,000 points per query. If you exceed these limits, Autodesk’s servers will not send you the information. Point costs vary by function – a query return is rated at 10 points, object info at 1. There is more information about this online.
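For automated workflows that fire many queries, it may be worth budgeting for these limits on the client side. The sketch below is a hypothetical helper – not part of any Autodesk SDK – that blocks when the one-minute point allowance, using the figures quoted above, would be exceeded.

```python
# Sketch only: a client-side throttle for a points-per-minute rate limit.
import time

class PointBudget:
    """Tracks a points-per-minute allowance (6,000 by default, per the
    figures quoted above). Individual costs must not exceed capacity."""

    def __init__(self, points_per_minute: int = 6000):
        self.capacity = points_per_minute
        self.available = points_per_minute
        self.window_start = time.monotonic()

    def spend(self, cost: int) -> None:
        """Block until `cost` points fit within the current minute window."""
        now = time.monotonic()
        if now - self.window_start >= 60:
            # A new one-minute window has begun: refill the allowance
            self.available = self.capacity
            self.window_start = now
        if cost > self.available:
            # Wait out the remainder of the window, then refill
            time.sleep(60 - (now - self.window_start))
            self.available = self.capacity
            self.window_start = time.monotonic()
        self.available -= cost

# Usage: charge each request before sending it
budget = PointBudget()
budget.spend(10)  # a query return is rated at 10 points
budget.spend(1)   # object info is rated at 1 point
```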


Watch Autodesk’s NXT BLD 2024 presentation

For more information watch the NXT BLD 2024 presentation from Autodesk’s data experts, Virginia Senf and Sasha Crotty.

The talk includes details about the acquisition of Datum360, which aggregates and connects multiple data sources together and applies industry standards with compliance reporting.



Open or walled garden?

In recent times, Autodesk has been making a lot of noise about being open and being a better open citizen. It has licensed the Industry Foundation Classes (IFC) toolkit from the Open Design Alliance (ODA), the most widely used IFC creation tools; created export services on the cloud to translate between different applications; and signed multiple interoperability deals with competitors, such as PTC, Bentley Systems, Trimble and, most recently, Nemetschek.

But why are interoperability agreements required in the first place? If you are open, you are open, and surely permission would not be required? A lot of these agreements are about swapping software, perhaps file format libraries, but increasingly they are about rights to API access.

This is the big issue for cloud. If you move your data to the cloud, where it might be translated into a proprietary database format, the only ways to get access to your data are via file export or API calls.



The path to granularity

The AEC data model API enables the break-up of monolithic files, such as Revit RVTs and AutoCAD DWGs, into ‘granular object data’ that is managed at a sub-file level in Autodesk Docs on the company’s secure cloud. This data is accessible in real time via Autodesk’s APIs and enables new capabilities.

Today, it would be great to get RVT, DGN and DWG files out with full fidelity; in five to ten years’ time, this will be as exciting as getting a DXF. Project information for all disciplines, at high levels of granularity, will deliver greater collective benefits than relatively dumb files. API access, meanwhile, is entirely under each company’s control, allowing firms to wall off their customers’ data from all but selected developers.

Autodesk’s cloud-based API platform, which was called Forge but is now Autodesk Platform Services (APS), comes with terms of usage, one of which, 5.3, simply states: ‘No use by competitors – Except with Autodesk’s prior written consent, you may not access or use the services if you are a competitor of Autodesk.’

For me, this seems to be the main reason for the agreements: to give express permission to access customers’ data via APS. But this isn’t given to all. It has to be negotiated, and I assume it’s on a quid pro quo basis, probably depending on how much of a threat you are. This is a kind of openness, but it’s going to be highly conditional and could be revoked at any time.


As all software moves to the cloud, the API world is also trying to work out ways of financially rewarding the software firms. It is inevitable that API calls will be charged for. All this software sits on Amazon AWS or Microsoft Azure instances, and all traffic comes with associated micropayments. Software firms are examining models to cover these charges while adding their profit margin. In the case of Autodesk, these are wrapped into the cloud credit system, covering functionality and the AWS bill. While the AEC data model API is currently free, it is throttled via the points system described above. Metering will be an important mechanism for future business models.

Conclusion

Changing the fundamental technology on which your applications and customers have built businesses is not for the faint-hearted. Keeping desktop software sales alive while re-engineering file-based workflows into granular ones is like changing a car tyre at 90 miles an hour. With the release of this AEC data model API, we now have some insight into how Autodesk will engineer the data model component.

For now, you can use all Autodesk cloud services as you currently use them. Instead of forcing a translation on every Revit file save, the granularity of files is handled on demand, for those that want to take advantage of it. The API, now and seemingly for quite a while to come, is all about output from Autodesk Docs (viewing, tabulating, querying), as opposed to doing something to the data and sending it back to Autodesk’s servers. This, I guess, is mainly about security.

Being able to view granular info via a simple interface is obviously a huge advantage over file-based workflows, where a lot of the BIM metadata goes to die. In some respects, there is ‘an openness’ here, enabling the viewing of design data held in proprietary files, within a proprietary database, on a paid-for service. The fact is, you still need to be an ACC subscriber to generate the granular data in the first place.

The most interesting development will be when geometry can also be output with the metadata through the Autodesk AEC data model API. Will the geometry be Universal Scene Description (USD), or will it be like IFC.js, with all the granular object data?

With IFC.js and open-source products like Speckle, it’s already possible to make data granular right out of the Revit desktop app, without having to pay to use Autodesk’s cloud services. The battle for granular data mobility has started.

Atvero Mail AEC email management tool to launch https://aecmag.com/data-management/atvero-mail-aec-email-management-tool-to-launch/ https://aecmag.com/data-management/atvero-mail-aec-email-management-tool-to-launch/#disqus_thread Thu, 04 Jul 2024 12:21:09 +0000 https://aecmag.com/?p=20866 Designed to bring order to email, which remains a key method of transmitting project information, documents, and drawings

Atvero Mail designed to bring order to email, which remains a key method of transmitting project information, documents, and drawings

CMap has introduced Atvero Mail, an email management tool specifically designed for the AEC sector. The software is built on Microsoft Outlook and offers automated email filing and ‘powerful search functionality’ to ‘instantly find’ emails and attachments for any project.

“With email remaining a key communication tool when working on projects, there was a market need for an email management product that emphasized discoverability and searchability for AEC firms, whilst keeping users in their familiar Microsoft 365 environment. This is exactly why we’re developing Atvero Mail,” said Marcus Roberts, head of Atvero.

CMap acquired Atvero in February 2023 as a document, drawing and email management solution.

After extensive engagement with the AEC community, the decision was made to split Atvero into two products: Atvero Mail (launching late Q3 2024) and Atvero PIM (which will continue to operate as a document, drawing and email management solution).

“Since acquiring Atvero we’ve seen enormous market demand, with new legislation, such as the Building Safety Act, changing the approach AEC firms have toward their information management practices,” said Dave Graham, CEO of CMap.

“During this time, working closely with our customers and the AEC community we’ve identified the vital need for a standalone email management tool to provide people with an easier way to get started on their information management journey.”

Atvero Mail will be available in late Q3 2024.

Cyberattacks: safeguarding contractors https://aecmag.com/data-management/cyberattacks-safeguarding-contractors/ https://aecmag.com/data-management/cyberattacks-safeguarding-contractors/#disqus_thread Wed, 22 May 2024 11:19:38 +0000 https://aecmag.com/?p=20549 Ben Wallbank, Trimble, shares some best practices to mitigate cyberattacks

It’s every construction firm’s biggest nightmare: criminals taking control of their data and holding them to ransom. Ben Wallbank, Trimble, shares some best practices to mitigate cyberattacks

Cybersecurity and cybercrime often conjure up images of hackers in dark hoodies, sneaking in the digital back door. In reality, nearly 90% of corporate cybercrime, such as phishing or ransomware attacks, is a result of employee error.

The UK construction industry is no exception and could be an even greater target than other industries. The massive amounts of data that contractors must protect, including through warranty and latent defect remediation periods, make them attractive to cybercriminals. Cybersecurity is so crucial to construction that the National Cyber Security Centre, along with the Chartered Institute of Building (CIOB), produced a construction industry-specific guide.

Cybercriminals who target the construction industry usually do so by accessing, copying and sharing data illegally or by installing malware on a company’s computers and network, taking control of files and holding them for ransom. It’s called ransomware, and it’s probably the most common and one of the most debilitating types of cybersecurity breaches in the construction world.

Each year, we hear of new cyberattacks, taking critical infrastructure offline and crippling construction businesses worldwide, including many here in Europe. These attacks cost billions of pounds a year and can cause whole cities, businesses and services to grind to a halt.

UK contractors should follow these best practices to safeguard against cyberattacks and improve outcomes in case of an attack.

Create a business continuity plan

Preparing for the worst puts your business in the best position because you can act quickly and have more control of the outcome. A solid cybersecurity disaster plan can get quite detailed. It should be consistently reviewed, practised and updated to net the best results in case of an incident. At a minimum, a business continuity plan should include the following:

  • Name of a leader to act as a central resource to manage disaster recovery across multiple departments.
  • A communication plan for sharing key messages and managing crises with employees, clients and additional project stakeholders.
  • A maintenance plan for a continually updated (and backed up) list of employee contact information and asset inventory.

Backup all data

A crucial aspect of any good cybersecurity plan is to make sure that everything is backed up, preferably in the cloud or physically on an offsite server that’s not on your network. Backups should be frequent and automated. Ask your IT provider to set them up so that they either happen in real time (if you’re backing up to the cloud) or run daily after everyone has left the office.
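By way of illustration, here is a minimal sketch of such a daily job: it zips a data folder and uploads it to S3-compatible cloud storage. The bucket name, paths and credential handling are hypothetical, and any equivalent offsite target would do. Scheduled with cron or Windows Task Scheduler, it runs unattended after hours, as suggested above.

```python
# Sketch only: daily off-site backup to S3-compatible storage.
# Bucket name, paths and credentials are illustrative placeholders.
import shutil
from datetime import date
from pathlib import Path

import boto3  # AWS SDK for Python; other S3-compatible stores work similarly

DATA_DIR = Path("/srv/project-data")   # assumed local data location
BUCKET = "example-offsite-backups"     # hypothetical bucket name

def backup_to_cloud() -> None:
    # Zip the whole data folder into a date-stamped archive
    archive = shutil.make_archive(
        f"/tmp/backup-{date.today().isoformat()}", "zip", DATA_DIR
    )
    # Upload it; the date in the key means old backups are kept, not overwritten
    s3 = boto3.client("s3")
    s3.upload_file(archive, BUCKET, Path(archive).name)

if __name__ == "__main__":
    backup_to_cloud()
```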

Secure mobile devices

Mobile devices are more challenging to secure than other data systems, but just as critical. Utilising an enterprise management platform, such as Cisco Meraki, allows you to maintain enterprise-level control over all of your devices. These kinds of platforms ensure that individual devices are still managed centrally, and contractors can limit software installation, track devices using GPS, disable devices and more.


Find many more articles like this in AEC Magazine
👉 Subscribe FREE here 👈

Protect software and servers

When it comes to software and security risks in construction, contractors should choose platforms and software providers that take security seriously. Granular permissions, user-friendly management systems and multi-factor authentication, for instance, are all must-haves in any construction software.

By using cloud-based, connected construction software, contractors shift the responsibility for maintaining servers, ensuring SOC 2 Type II compliance, and backing up and storing data onto the vendor. Project and business data backups happen automatically, providing daily protection, with costs often included or rolled into users’ subscription fees. New software features and security functionality are also rolled out automatically.

By coupling the backups with cybersecurity protections, cloud vendors use the latest technologies to thwart cybercriminals and provide an extra level of protection not otherwise achieved through in-house backups. When shopping for business software, make security one of your first discussion points.

Additionally, your web and email servers need to be properly protected to avoid online attacks. Physical network servers need to be secured, and you need to ensure that any cloud-based solutions you’re using also implement rigorous security protocols.

Ensure employee buy-in

Cybersecurity protection in construction requires every employee at every level to be fully engaged and actively vigilant. There are several steps to take to make that happen:

  • Ensure all employees receive regular cybersecurity training, especially if online workflows or procedures change.
  • Welcome feedback from team members and update cybersecurity policies and processes as needed.
  • Counsel employees on everyday precautions to take before opening an email: looking for spelling and grammar errors, verifying the sender’s email address, and never opening unexpected attachments.

Take the first step: get started

The most important step is the first one. The UK government offers two certifications – Cyber Essentials and Cyber Essentials Plus – that are crash courses in the basics to keep businesses safer from cybercrime. While they don’t replace a cybersecurity risk assessment, they will show you how to do one and how to select the security measures your business needs.

Anywhere your data is stored or used is a potential entry point into your company’s digital existence. It only takes one slip to allow malicious code or ransomware in, and once it’s there, it can cause millions of pounds’ worth of damage.


About the author
Ben Wallbank is a BIM strategy and partnerships manager for Trimble.
