✱ ARE Hacks - My new book

Hi everyone! It's been a while since I've posted, and I have a pretty good reason why this site hasn't been updated: I've written a book. This site is all about adding value to yourself, and my new project fits right into that idea... it just isn't about digital tools this time (although there are many digital tool recommendations in the resource chapter). It's about getting motivated to pass the Architecture Registration Exam and become a licensed architect. If there's one thing that adds value to your career in architecture, this is it.

If you'd like to learn more, head over to the ARE Hacks website. There you can see what people are saying about the book and decide if it's for you. If you like the tutorials and training I've done here at Method, please check it out. There are paperback and Kindle versions available. Also, if you sign up for the newsletter while you're there, I'll send you a free chapter.

It's my mission to make people more valuable to the profession of architecture, and that doesn't stop at this website. I hope to see you there! 

SketchUp News: 3d Basecamp Keynote Video

Here's this year's SketchUp 3d Basecamp Keynote video. If you have time, it's worth watching the whole thing to get the 2016 SketchUp state of the union. In my opinion, the best part is at the 29-minute mark, where you'll see some great work by Alex Hogrefe of https://visualizingarchitecture.com/. It's mostly Photoshop workflows, and the end result is eye-catching. This kind of work doesn't have to use SketchUp, of course; it's more a discussion of image quality and feeling. At the 56-minute mark there is an interesting section on the feedback Alex gets based on the types of images he posts online. It's good insight to consider for the images you make, depending on who your audience is.

Video: Get a workout painting in VR with Google's new Tilt Brush

Building on my last post about VR, here's the latest tech from Google called Tilt Brush. I told you that VR was going to be huge this year, and this is the latest mouth-watering entry into the burgeoning field of virtual reality.

This example uses the HTC Vive system. Users get a 9' cube of blank canvas space to work within. In one hand you have the motion-tracked 'brush' and in the other you have your three-dimensional painter's 'palette'. The brushes are dynamic, much like using the Apple Pencil or a Wacom tablet with jitter and flow variations, except now in 3d. Internal and external sensors track your every move, constructing all of your brush elements in real time and keeping them in the exact spot you place them as you wave your magic wand-brush.

This looks incredible. Everyone needs to see this video. Make no mistake – there will be tools like this developed for architecture, and I can't wait.

After you watch the video, head over to the Tilt Brush site and check out their additional videos that highlight the other aspects of the system. 

And... it's a real thing, available now on Steam if you have an HTC Vive and a badass computer that meets the hefty system requirements.

 

Tilt Brush lets you paint in 3D space with virtual reality. Unleash your creativity with three-dimensional brush strokes, stars, light, and even fire. Your room is your canvas. Your palette is your imagination. The possibilities are endless. Learn more at http://tiltbrush.com

✱ Get Inside Your Project with VR

2016 is turning out to be a huge year for Virtual Reality (VR). I'm sure you already know this. The stars have aligned – the raw power of our pocket computers and the tools that allow us to create interactive experiences have finally matched up, and this is great news for architects.

This article isn't about the Oculus Rift ($599) or the HTC Vive ($800), both of which are technically more complicated. By that, I mean that they require a headset with wires, an expensive computer, motion trackers and a gaming controller to walk through your building. They are both amazing pieces of hardware, and I'll be watching closely as their development continues. I'm sure the prices will begin to drop as the technology moves forward.

This article is about a simpler, wireless technology that is accessible to far more people. I'm talking about Google Cardboard ($20 or less) and 360 degree spherical stereo panoramic renderings that many of you can already produce with the right software and view in a modern web browser.

Immersive VR Headsets 

Before I get to the Google Cardboard part of this article, I want to quickly talk about the Rift and Vive. I'm really excited about the technology these headsets promise. They are, for all intents and purposes, fully immersive. The videos I've seen of people using them are fun to watch, and it's easy to imagine myself as a user. That's something I'm pretty comfortable with, since I'm an architect who makes a living imagining what users will feel like in spaces before they're built. You can probably imagine, in turn, why I'm so excited about the possibilities of these new tools.

Here's a fun example. Turn down your speaker volume if you're in an office; even though the language has been bleeped out, it might still raise an eyebrow or two:

 

There are some really incredible experiences being developed to fully immerse users in other worlds. Architects have an unprecedented opportunity to jump on this technology and use it to our advantage. I've had high-powered clients use these tools and quickly turn back into 6-year-old kids. It's really fun and powerful to see them truly experience their buildings in this new way.

With the release of the Rift v2 and the Vive, the resolution of the screens has doubled, and the processing power pushing data to the headsets has increased substantially. It's now possible to actually feel like you're in a nicely rendered, real-time environment, and both sound and motion tracking are part of the system as well. They do require hefty computers with serious graphics processing units (GPUs), however, and along with the price tag, that's why they aren't accessible to everyone. On top of that, the headsets are connected to the computer via wires that scoop over the back of the user's head, so being tethered is something else to consider. If we were to give a client a demonstration with the Oculus or Vive, they would most likely need to come to our office, or we would need to tote along a substantially expensive powerhouse laptop and a lot of parts when we visit them.

Google Cardboard

In contrast, what makes Google Cardboard so compelling is its simplicity. It runs on devices practically everyone on the planet already has in their pockets and purses, there aren't any wires, there's no software to install, and it's laughably affordable at less than $20. Sounds too good to be true, right? I know... but it actually works. I have a project I've been working on to prove it, and I want to share it with you. All you need is a Cardboard viewer. Actually, you don't even need a viewer: you can look at my project without one, though there are advantages to having one in your possession. It all runs in a web browser, so you already have what you need.

Math & Science Building, STEM Center - Golden West College, Huntington Beach CA
Imagery and VR courtesy of HMC Architects

Once you open our STEM center project by clicking on this link or on the image above, you'll see a panoramic interior view of the space. Use your finger if you're on a phone or tablet, or your mouse if you're on a computer, to look around. This is pretty standard panoramic navigation. What's a little different about these panoramas is that several nodes are linked together via the large white-bordered arrows you see hovering over the floor. If you tap on one of those on your computer, or hover the crosshairs in the Cardboard viewer over it for a couple of seconds, you'll be transported to another location within the space. You can actually tour the building by jumping from node to node!

It's magic

Let's take it a step further. There's an icon at the bottom of the screen that looks like a Google Cardboard viewer (where the green checkbox symbol is located in the image above). Tap that, and the screen splits in half, displaying synchronized stereo panoramas. This is where the magic happens. Once you've entered this mode, place your phone into the Cardboard viewer for an amazing 3d experience. Because the twin renderings are offset by roughly the width of a human's eyes, you perceive depth. Stereo panos coupled with the gyro motion-tracking sensors in our phones let us look around inside our spaces in full 3d as if we were standing in them. If you do it long enough (about 5 minutes or so), you'll actually start to believe you're there. It's a mind-bender when you lower the viewer and find yourself standing in a different space than the one you were just experiencing.
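
If you're curious how those twin renderings are typically produced, here's a minimal sketch of the idea in Python. This is not the pipeline used for our STEM center project; the function and the roughly 6.4 cm interpupillary distance are illustrative assumptions.

```python
import math

# Minimal sketch (not this project's actual pipeline): position a left/right
# camera pair for rendering a stereo image. The ~6.4 cm eye separation
# (interpupillary distance) is a commonly cited average, assumed here.
IPD = 0.064  # meters between the two virtual "eyes"

def stereo_eye_positions(camera_pos, view_azimuth_deg, ipd=IPD):
    """Return (left_eye, right_eye) positions offset perpendicular to the
    horizontal viewing direction.

    camera_pos: (x, y, z) of the single mono camera
    view_azimuth_deg: horizontal viewing direction in degrees (0 = +x axis)
    """
    az = math.radians(view_azimuth_deg)
    forward = (math.cos(az), math.sin(az))   # direction the viewer faces
    right = (forward[1], -forward[0])        # forward rotated 90 deg clockwise
    x, y, z = camera_pos
    half = ipd / 2.0
    left_eye = (x - right[0] * half, y - right[1] * half, z)
    right_eye = (x + right[0] * half, y + right[1] * half, z)
    return left_eye, right_eye

# Example: a camera at eye height (1.6 m), looking along the +y axis
left, right = stereo_eye_positions((10.0, 5.0, 1.6), 90.0)
print("left eye: ", left)
print("right eye:", right)
```

A full spherical stereo panorama repeats this offset for every viewing direction around the circle, so each slice of the pano is rendered from a slightly different eye position, but the core idea is the same: two viewpoints separated by roughly the distance between your eyes, which is what your brain reads as depth.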


I can't believe we carry computers powerful enough to do all of this with us all the time. This is so exciting!

If you don't have a Google Cardboard viewer yet, you can get one on Amazon for less than $20. A word of warning: not all viewers are created equal. Yes, you can find them for less, but the lenses in the cheap ones are crap, and the lenses are the most important part because they ensure you can actually see what you're looking at. I've included a link on the right to a viewer that has great ratings and nice lenses, so you can see the space you go into without blurred vision.

Why should you care about VR?

I'm excited about these immersive environments for two uses. First, for presentations, this is a killer way to communicate a design. As I've talked about before, it allows us to put our projects in our clients' hands and let them drive. Clients don't want to watch you drive around their project in 3d; they should get to do it themselves. Second, I'm excited to use this as a tool during the design process. I don't need a fully photo-realistic rendering to make spatial decisions. I could easily model space in my 3d program and spit out a clay model of the project so I could quickly get inside and check it out. For this, photorealism isn't necessary the way it potentially is for a client presentation. I now have the ability to intimately and immersively experience shade and shadow, volume, light, the way spaces interact, and much more in a very different way than just seeing it on a screen or on a sheet of paper.

In both instances, the barriers of 2d mediums are no longer a communication hindrance, and projects can be experienced, and therefore understood, without the need for a presentation. In other words, people just get it, and the work will, for the most part, be able to speak for itself. I couldn't be more excited to share architecture with people in this way.

In a future article, I'll talk more about the software needed to generate these types of panoramas so that you can do this too. It's a fast-moving technology and things are constantly changing, so I'll keep it as timely as possible and you can give your own models a try and experience them for yourself.

Happy viewing!


Special thanks to Chris Grant and Francisco Penaloza, both good friends and visualization gurus who have contributed to my understanding of this technology in huge ways. It is through their talent and drive that I get to show off our project in these amazing ways.

Link: Astropad

Image courtesy Astropad

Astropad, released today, is a new piece of software that connects the dots between two pieces of technology you might already have - an iPad and a Mac - and allows them to work as one. Our iPads can now be used as professional-grade graphics tablets for $50 ($20 if you're a student).

The developers of Astropad have solved a problem that has nagged this kind of interoperability for years, which is the lag between input and the result on the screen, by creating something they call LIQUID:

Creating Astropad required innovative new technology we call LIQUID. The result is stunning image quality and responsiveness never before seen in similar tools.

LIQUID is true to your source material with color corrected output and retina resolution. What you see on your iPad is the same as on your Mac.

I also love this graphic on their site. Fingers crossed indeed.

Previously, using AirPlay, this kind of mirroring ran at 30 frames per second (FPS); with LIQUID we get 60 FPS, which roughly halves the time between screen updates (from about 33 ms to about 17 ms). This helps a lot.

I'm hopeful about Astropad because it solves a very real problem I've always had with standard graphics tablets, which is the disconnect between the hardware and the screen, and I've never been able to justify buying a Cintiq. Having direct input between the stylus and the pixels is a big deal, and I'm excited to try this out and see how well it works, keeping in mind that it's a v1 product. Will it be as responsive as a Wacom tablet? Nope. But it also doesn't cost $350.

What do you think about it?

(h/t Chris Grant)

 

Update: I've tried Astropad, and it works surprisingly well. I used it with the Mac version of Notability and with Photoshop CS6, connected between my iPad Air and my 27" iMac.

When you download and install the Mac app, and then install the free iPad app, you go through a simple pairing process to connect the two over wifi or USB. The instructions are very well done. Except... I tried it in my office, where I have no control over the wireless system, and it didn't work. Both devices must be on the same wifi network, which they were, but for some reason they couldn't communicate. To be fair, there are hundreds of devices on that particular wifi, so I'm not surprised they couldn't find each other. I ended up simply creating a wireless network on the iMac and connecting to it from the iPad. Once that was set up, they found each other right away. Another option is a hard-wired connection, but I don't carry a Lightning cable with me. Plus, I really wanted to test it over wireless to see how well it would perform over the air.

In Notability

Notability is a note-taking app with some really nice ink technology, and I wanted to see if I could use it with a graphics tablet to do markups (redlines) on PDFs in real time during a GoToMeeting while sharing my screen. I don't think the developers were really thinking of this particular use, which was another reason I chose to try it.

Astropad excelled at what it was designed for: letting me draw directly on my iPad with my stylus while my work was mirrored on my Mac. There was little to no lag between the two. For a first release, it is very impressive.

What it did not excel at was giving me Notability's controls on my iPad screen, which again I can't really blame the developers for. What this means in real-world use is that I had to go back and forth a lot between working on the iPad and using the mouse on my computer to switch between tools, brush sizes, and colors. I don't think Notability has keyboard shortcuts for what I need; if it did, that would make things much easier.

In Photoshop

Astropad works even better in Photoshop, which seems to be the app's initial target market. It's the same great painting interface, you can zoom and pan around your document right on the iPad, and there are common tool shortcut buttons on the iPad's screen for brush sizes and more. These really help make the experience much better because you spend a lot less time switching over to your mouse to get to your most-used tools.

Conclusion

It would be great if more Mac app developers added an Astropad connection now that it's out. Having software shortcut buttons right on the iPad's screen would be welcome and would surely lead to more sales.

Never once did the connection fail. Even after long pauses between uses, I could just pick up and continue without re-pairing the devices.  

Probably the biggest challenge is using Astropad between two devices with such different physical screen sizes. I was going between 27" and 11", so a lot of time is spent panning around to see what you want to see on the iPad. I'm sure I would get much better at it with more use. There are three zoom settings you can access on the Mac side: full screen, 100%, and 200%. I was usually in 100% or 200% mode because I was painting and the scale of the image on the iPad made sense at those settings. Full screen between these two particular devices makes no sense, but it might for people whose devices are closer in size (a notebook, for instance).

I can't wait to see where they go with this. What if Apple comes out with the rumored iPad Pro? What if Apple comes out with its own stylus? This could really change the graphics tablet game for professional use.

My advice is to give it a try for yourself. The app comes with a 7-day free trial, so you can see if it would work for you. Let me know how it goes.

 

Link: Google Earth Pro is Now Free

Over on the Google Lat Long blog, the latest news is that Google Earth Pro is now free. I use it on every single project for my virtual pre-site visits and for lots of contextual research. It helps me get a jumpstart on my designs and has been an invaluable tool, so making it available for free is great news. It's better than the previously free version (just called Google Earth) for a few reasons:

  1. You can export super-high-res satellite imagery, up to 4800 pixels wide, and use it as an underlay for your 3d modeling. The extra-large exports also come in handy when making large-format presentation boards (at 150 dpi, a 4800-pixel-wide image prints 32 inches wide). Tip for underlays: turn off the Terrain and 3d Buildings layers, and get a straight-down view by Command+clicking and aiming straight down with your mouse.
  2. You can see property lines and US parcel data, including lot size!
  3. Turn back time and see how neighborhoods have developed by skimming through old satellite photos with the Historical Imagery slider.
  4. Use the Path and Polygon tools to take measurements. Turn on the Ruler in the toolbar to see how long the paths you draw are.

Here are a few other things you can do with Google Earth, in either the free or Pro version:

  1. Find 3d buildings that highlight in blue. You can click on them and download them, with textures, from the SketchUp Warehouse. If you use SketchUp, they come into your project geolocated and make great 3d context.
  2. Use the heck out of Street View. Just keep double-clicking all the way down to a street and you'll drop into Street View mode. Then click on the street in the direction you want to go, or scroll with your mouse wheel to drive down the street.
  3. Take a trip to the Moon or Mars and explore. There is some crazy stuff embedded in those models.

These are the most valuable things I use it for on my projects. What do you use it for? Leave a note in the comments. 

End of 2014 Deal: 20% off my Maxwell Grass Preset Pro Pack

First, I want to thank you for visiting Method this year. It's my hope that you've learned new skills and workflows from my tutorials and articles. The world needs great architects, and it's my goal to make you worth more to the industry and to your clients. 

The Method website is a labor of love for me, and I want to be able to do even more in 2015. Please let me know how I can help you; simply send me an email.


Get Grass!

Here at Method, I'm closing out 2014 by celebrating you! You've worked hard this year, and I want you to start 2015 off right with some new tools that will make you worth more. 

Until the end of the year, my Maxwell Grass Preset Pro Pack can be yours for only $79. That's 20% off the regular price. It works with Maxwell Render v2.7 and newer along with SketchUp or FormZ/Bonzai3d. 

How will the Preset Pack make you worth more?

  1. No more trial and error. You'll save huge amounts of time setting up grass in your scenes with my presets and photo-real materials. 
  2. You'll have a set of tools you'll be able to use time and time again, increasing your productivity in every scene you put grass into.
  3. Your renders will look even better than they already do, which will raise the value of your services, and allow you to charge more!

Watch this short video to see what it's all about:


Not only do you get the 17 presets and materials... you also get example scene files and TWO bonus training videos to take your skills to the next level.

This deal is good until midnight EST / 9pm PST. At that time the price goes back to $99.

So get that tax write-off before 2014 comes to an end and make yourself worth more with my Maxwell Grass Preset Pro Pack.

Happy rendering!




✱ Q&A: Using Maxwell Grass 'Level of Detail' for Faster Renderings

Here's something that comes up every once in a while when using the Maxwell Grass extension: running out of memory. Grass blades can take up a lot of RAM when rendering, and if the surfaces you apply the grass to are too large, the rendering can be extremely slow and sometimes even fail.

And no one likes a failed rendering!

So here's a quick screenshot of some settings to look into when using the grass extension, called Level of Detail (LOD):
 


Use the L.O.D. settings in the Maxwell Grass extension to control falloff of grass blades.
Click to enlarge.


Basically, if you set it up correctly, the further away a surface gets from the camera's point of view, the fewer grass blades are generated, and you control how that happens. This is a HUGE deal when your surfaces are expansive. The image above is an extreme example because of how closely I placed the distance values together, but it illustrates how they work. Once the distance passes the Minimum value, the blade count falls off toward the Maximum Density setting and then holds steady beyond the Maximum distance value. This way there is no sudden step in the number of blades; instead you get a nice, even gradient from full to partial density.
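
To make that behavior concrete, here's a minimal sketch of the falloff as a simple function. This is just an illustration of the idea, not Maxwell's actual code: I'm assuming a linear blend between the Minimum and Maximum distances, and the 10 m / 50 m / 20% values in the example are made up.

```python
def lod_density(distance, min_dist, max_dist, max_density_pct):
    """Approximate fraction of grass blades kept at a given camera distance.

    Full density up to min_dist, an (assumed) linear falloff between
    min_dist and max_dist, then a constant max_density_pct beyond max_dist.
    """
    if distance <= min_dist:
        return 1.0
    if distance >= max_dist:
        return max_density_pct / 100.0
    t = (distance - min_dist) / (max_dist - min_dist)  # 0 at min, 1 at max
    return 1.0 + t * (max_density_pct / 100.0 - 1.0)

# Hypothetical example: falloff starts at 10 m, ends at 50 m, far density 20%
for d in (5, 10, 30, 50, 100):
    print(f"{d:>4} m -> {lod_density(d, 10.0, 50.0, 20.0):.0%} of blades")
```

The point is simply that density tapers smoothly with distance instead of dropping off a cliff, which is why the grass still reads as continuous while the memory use and render time fall dramatically.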

There's a bit more info about this near the bottom of the Maxwell Render help page, which you can reference here.

Use this tip to speed your renderings up drastically!

This tip also comes in really handy when using my Maxwell Grass Preset Pro Pack. Check it out and take your images to the next level if you're using SketchUp, FormZ or Bonzai3d.


This site is supported by people like you! Every dollar helps, and goes directly towards the costs associated with running the site and making more tutorials.

Donate

I love to meet new people. If you think someone else would like this article or site, please share this blog post by using the share buttons in the lower right corner of this blog entry. The more you share, the better this site can get. Thanks!

✱ Q&A: Troubleshooting Emitter Lights in Maxwell Render for Beginners

I got an email over the weekend about troubleshooting emitter lights in SketchUp using Maxwell Render, and I thought I'd post my answers here because they could be useful for you. Lots of people struggle with emitter lights when they're first starting out with Maxwell Render and think the software just doesn't work, when in fact they're probably just used to the brute-force techniques required by other rendering programs. I know, because that's what happened to me when I first tried using emitters in Maxwell.
