Generating Watercolors with AI

We have all seen how artificial intelligence (AI) has been used with photography to do things like fix photographs in Adobe Photoshop, apply fun filters to photographs on Instagram, or transform a photograph to embody the style of a famous painting. That last example, style transfer, uses an AI model called a Generative Adversarial Network (GAN), which can also be used to generate entirely new images when trained on a collection of images. This is what blew my mind this week.

I am part of a global organization at Microsoft, Commercial Software Engineering (CSE), where we help partners and customers with their most complex software engineering challenges by coding with them. The team is made up of software engineers, data scientists, and program managers like me who all love to hack and build stuff. We just completed a week-long hackathon combining virtual and in-person teams, where individuals proposed ideas they wanted to explore and then formed teams to build something in less than four days.

Meeting Paul

Michael and Paul

Paul Butler, a software engineer, recent Computer Science graduate, and CSE team member, proposed an idea to use StyleGAN2-ADA to generate artwork. I had been learning about GANs, and my colleague Kevin Ashley had just published a book on creating art with AI – and that really piqued my interest. I had never met Paul before, but we quickly found a common thread and interest, as we both went to college in Arizona: he’s an ASU Sun Devil and I’m a UA Wildcat – and he’s super smart. Paul knew how to run the GAN using Python, but he didn’t have a training image set to work from – and that’s where I was able to help.

30 Years of Artwork

Ever since I started college studying architecture, I have been keeping visual journals of my artwork, taking a journal and compact supplies with me to document my journey when I travel for work or leisure. Having done this consistently for thirty years, I have filled more than 20 journals with drawings, watercolors, collage, pop-up craft, stickers, circuitry, and photographs. When the pandemic started and I had some time where I couldn’t travel, I scanned and categorized the pages using Adobe Photoshop Lightroom, adding dates, locations, and tags to the images identifying media and medium. Ending up with a collection of over 2,000 journal pages, I had a tagged data set of imagery. I proposed to Paul that we use a subset of the pages, 315 of my watercolors, to train the GAN to make an AI that could create watercolors.

The Training set of 315 Watercolors

Watercoloring

I had started watercoloring when I was 18, on a summer trip with Chuck Albanese, one of my architecture professors, to sketch and watercolor in Italy and Greece. Watercoloring starts out hard as you learn to understand how to control the water and color, but like every skill you can get better at it with practice.

Pantheon, Rome in 1989
Joshua Tree, CA in 2020

I had just finished my first year of architecture school, where I learned from Kirby Lockard how to accurately draw freehand perspective, which is crucial in watercolor as a painting often starts with a base drawing in pencil or pen. I loved watercoloring because it was portable and quick, so I could take my creativity with me wherever I went and create something in as little as 15 minutes.

Travel watercolors during COVID lockdown via Microsoft Flight Simulator

Journals as Memento Collections

Ever since that trip, for more than thirty years, I’ve been sketching and watercoloring, and with the Grail Diary from Indiana Jones and the Last Crusade as inspiration, I started using journals to record my journey, thoughts, and ideas and to collect the ephemera that I gather along the way. What I found is that while I am sitting somewhere creating artwork to go on a page of my journal, my mind is recording everything around me: the people I meet, the conversations I have, what else I observe, and my feelings. The artwork turns into a mnemonic for the moment, a memento. My journal is a serial collection of personal mementos that I can openly share with others, knowing that I don’t have to write a personal narrative to record it. When I look at a page in my journal years later, I instantly recall the experience.

Adobe HQ, Noida, India

Training the GAN

Paul started working on the Python code to generate images, and I started refining my watercolor collection, isolating the watercolors on pages that also contained other content like line drawings and collage, to make a consistent collection of watercolors. Over the course of more than thirty years my style has evolved, so there is a huge amount of variability in the collection, the only common thread being that they were all watercolors and all created by me. Paul wrote the code so that a single numeric value from 0 to 999, a random number seed, would be the only input variable to generate each image. He ran the AI model for a few hours and we started seeing results come out of it – and I was blown away!
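Paul’s actual code isn’t included here, but the seed-to-image idea can be sketched in a few lines. In the public StyleGAN2-ADA codebase, an integer seed deterministically produces a latent vector that the trained generator turns into an image; the function name and the 512-dimension latent size below are assumptions based on that repository, not Paul’s code:

```python
import numpy as np

# A sketch of the seed-to-latent mapping used by StyleGAN2-ADA-style
# generators: one integer seed deterministically selects a latent
# vector, so image #0278 is reproducible from the number 278 alone.
def seed_to_latent(seed: int, z_dim: int = 512) -> np.ndarray:
    if not 0 <= seed <= 999:
        raise ValueError("seed must be between 0 and 999")
    return np.random.RandomState(seed).randn(1, z_dim)

# The trained generator (not shown) would then map the latent to pixels:
# image = generator(seed_to_latent(278))
```

Because the same seed always reproduces the same latent vector, each generated watercolor can be identified by its number alone.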

#0278

Mementos in the Images

The AI found patterns and my techniques in the watercolor collection and started creating new, unique abstract images that instantly triggered memories in me. I not only recognized elements in the images, but those images also triggered memories of the experiences I had when creating them, combining those memories in a way that had only happened for me in dreams. These images were mementos that spanned time and space.

#0780

The image above, #0780, reminds me of the time that I took my son Sam, then 12, to a nude figure drawing session at the local art store five years ago, where we both spent the evening sketching and painting. As you might imagine, it was a memorable evening, and seeing this image, I was brought right back there. As many of my watercolors over the years have been of monumental architecture, there is a heavy influence on the AI model for that kind of subject. In this generated artwork, I also see a hint of the architecture of Antoni Gaudí, one of my favorite architects, whose work is the subject of a number of the paintings in the training set.

#0216

My Style in the Images

In the more than 1,000 generated images, I saw my style, composition, coloration, linework, and brushwork, but in very abstract forms, much more abstract than I’m comfortable with consciously doing in my work, which today would be best characterized as Urban Sketching. Paul tells me that we can do much more to increase the variability, improve the quality, and refine the model. You will likely notice a recurring theme in many of the images of a monumental building on the right of the image; this is either an artifact of the AI model or a hidden propensity for me to paint buildings on the right side of a watercolor – definitely an interesting direction for investigation. Paul is eager to build a model against my whole corpus of artwork, which sounds like a very cool idea.

#0910

Putting the Artwork in a Gallery

I took more than 200 of the generated images and put them in a virtual art gallery using my Galeryst site. Generated artwork in a generated gallery seems very appropriate. I originally created Galeryst to share my journals with others as bound journals don’t typically exhibit well in physical galleries.

Generated Watercolor Artwork on Galeryst

Sharing the Entire Collection

Generated Watercolors in Lightroom

I’ve shared the entire collection of generated watercolor artwork using Adobe Photoshop Lightroom because I want your feedback on the images. You can also see a slideshow of the entire collection on the Lightroom site if you click on the … in the upper right of the page. Here is the feedback I would love from you:

  • If you like an image, click on the ❤️ heart button in the lower left corner.
  • If an image reminds you of something or somewhere, please click on the 💬 comment button in the lower left and leave a comment.
#0843

On Adobe Photoshop Lightroom Mobile, there’s a cool feature to “Choose Best Photos”. I ran the analyzer on the collection with a quality threshold of 16, and these are the top 16 that it picked:

Best 16 Photos as Chosen by Adobe Photoshop Lightroom

Pretty cool, right?

Galeryst Beta Applications Open

Galeryst: Curate your art exhibition

Galeryst is a new site that builds 3D galleries from your Adobe Photoshop Lightroom albums. We are looking for Lightroom users who are interested in trying it out in a private beta test before the site launches publicly. If you are interested, go to https://galeryst.com to apply to the beta program.

Mars Perseverance Landing with MakeCode Arcade

I look back at one of the first video games that I remember having fun playing: Lunar Lander, created by Atari. Not only was it a game in the arcade that cost a shiny quarter per play, but it also ran on the TRS-80 computers in my school’s computer lab, which were free for me to use. I liked that deal, especially as I was learning how to use BASIC programming to make the pixels move on the screens of those computers. For me, and many kids of my generation, computer games – very basic computer games – were our draw to computers. I pored through computer magazines, which had listings of the BASIC code for games that I typed in, line after line. Debugging was going through the code again line-by-line until I found each of my typos. I then started on my own ideas: using a for/next loop, I was able to make a spaceship fly across the screen just like I saw Kirk’s Enterprise accelerate to warp speed. I was hooked.

That’s how I started coding – that’s why I started my path in software at the age of 11 with games like Lunar Lander.

Lunar Lander

When I was sent an AdaFruit PyBadge by a colleague in December to experiment with, I immediately thought back about the games like Lunar Lander that inspired me as a young boy to start coding.

Adafruit PyBadge for MakeCode Arcade, CircuitPython, or Arduino
Adafruit PyBadge

The Adafruit PyBadge is a mini $34.95 computer that you can code with MakeCode Arcade, CircuitPython, or Arduino. You write the code on a computer and download it to the device, which has a small color screen, buttons, lights, sensors, a speaker, and various connectors to enable all sorts of other circuitry. I immediately thought of Lunar Lander, but I had also heard about the NASA Perseverance mission to Mars that was underway and thought I might try to create something similar for the Perseverance mission by the time it lands on Mars on 2/18/2021.

Once I started researching the mission, I realized that it was way more complex than the lunar lander. My challenge was to make a game around the landing of this rover on Mars. Here are the steps:

  1. Capsule enters Mars atmosphere and decelerates with heat shield
  2. Capsule slows down with parachute
  3. Heat shield ejected
  4. Lander drops out of capsule and starts rockets
  5. Lander gets close to surface and lowers rover to ground with cables
  6. Lander flies off
  7. Rover starts its mission to explore Mars, looking for signs of ancient life.

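My game models only a simplified version of this sequence. As a rough illustration (not the actual MakeCode blocks, and with hypothetical thrust and time-step values), the powered-descent phase in steps 4 and 5 boils down to a tiny physics update per game tick:

```python
# Simplified powered-descent physics for the lander (steps 4-5).
# The thrust and time-step constants are hypothetical, not from the game.
MARS_GRAVITY = 3.71  # m/s^2, surface gravity of Mars
THRUST = 8.0         # m/s^2, deceleration while the down arrow is held

def step(altitude: float, velocity: float, thrusting: bool, dt: float = 0.1):
    """Advance one game tick; velocity is measured positive downward."""
    accel = MARS_GRAVITY - (THRUST if thrusting else 0.0)
    velocity += accel * dt
    altitude = max(0.0, altitude - velocity * dt)  # clamp at the surface
    return altitude, velocity
```

The whole game loop is then: let gravity pull the lander down each tick, and fire the rockets while the player holds the down arrow to bring the descent rate near zero before releasing the rover.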
My mission was to create a game around that, so I started building it with MakeCode Arcade, using the drag-and-drop interface to make something fun. MakeCode is a web-based programming environment for kids that can be used to program Minecraft, hardware devices like the BBC micro:bit and LEGO Mindstorms, and games. My first experience with MakeCode was to animate the BBC micro:bit on the bag I use for my journaling/art supplies, which I’ve shared on Thingiverse. MakeCode Arcade is a version of MakeCode that makes it easy to build games with sprites, animations, and interactivity. The beauty of MakeCode is that you can switch between the graphical block-based programming and the code view to see that they do the exact same thing – a great way to “graduate” to text-based coding.

I was able to get pretty far, but I ran out of time, as the actual Perseverance rover will land on Mars in two days. I’ve shared the source code so anyone can try it out and use it as a starting point for their own experimentation. The amazing part about the Perseverance mission is that the whole landing sequence will have to be done by computers without direct human control, most likely with artificial intelligence, since the time it takes for radio signals to travel between Mars and the Earth is between 4.3 and 21 minutes.
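That 4.3-to-21-minute range is just light travel time over the varying Earth–Mars distance, roughly 78 to 378 million kilometers depending on where the planets are in their orbits. A quick back-of-the-envelope check:

```python
SPEED_OF_LIGHT_KM_S = 299_792  # kilometers per second

def one_way_delay_minutes(distance_km: float) -> float:
    """Light (radio) travel time from one planet to the other."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60.0

# At closest approach (~78 million km) the delay is about 4.3 minutes;
# near maximum separation (~378 million km) it is about 21 minutes.
```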

The gameplay is this: once the lander is detached from the capsule, use the down arrow to slow the descent. Once close to the surface, press the A button to release the rover. The rover can then explore the surface of Mars with the left and right buttons, pressing A again to send a pulse looking for water underground. That’s as far as I got with the time that I had. Anyone is free to tinker, modify, and adapt it – just please share with me what you do with it. I’d love to see where this goes.

I think the possibilities for kids today to learn coding and build fun games that can be loaded onto a tiny computer are so cool. The block-based programming makes it so easy to learn the basic concepts of coding while creating fun games. I printed a basic case for the PyBadge with my 3D printer, and here’s my Perseverance game playing on it.

Perseverance Game on PyBadge in 3D printed case.

What can your young coder create today?

Animating your Web Meeting Experience

Like many of you in the pandemic, I spend many hours of my days in web meetings – for me, a combination of Teams, Zoom, WebEx, and Google Hangouts. Turning cameras on really helps to connect with everyone, even though it’s through a small array of dancing pixels. I’ve had fun using tools like Adobe Character Animator and OBS Studio with a reMarkable tablet to turn my camera feed into something a bit more interesting.

My COVID-19 Puppet in Adobe Character Animator
Using a reMarkable tablet with OBS Studio as a transparent whiteboard

Using Adobe Illustrator, I’ve made a custom frame that I’ve started using in OBS Studio to express myself, similar to how I might use my attire or decorate my workspace in an office setting. I apply a Chroma Key filter in OBS Studio.

My frame created in Adobe illustrator and saved as a .png image.

I then composite a number of text and image elements onto it that have meaning to me – including a slideshow of my artwork. It’s been a great conversation starter before the start of a meeting. But I wanted to do a little bit more…

The frame and other sources in OBS Studio

Adding Animation

I’ve always thought that adding a little bit of animation might be fun with the free Microsoft Photos app. If you are involved in selling or talking about real-world items, or even virtual ones, there is a great opportunity to share animated 3D models of those items in your camera feed as well. Here’s how I did it:

The frame in the free Microsoft Photos app.
  1. I opened my frame .png image with the Microsoft Photos app and selected Edit & Create…Create a video with Music.
  2. I named the video Animated Frame and pressed OK.
Naming the project so I can find it later.
  3. I tapped on the 3.0 timespan on the first frame in the storyboard and changed the timespan to 10 seconds.
Editing the duration
  4. I tapped on the 3D effects button to open the 3D Effects pane.
Selecting 3D effects
  5. In the 3D library tab, Sci-Fi & Fantasy group, I selected the Landing UFO, and it was imported and showed up on my frame.
Adding a Landing UFO to my frame.
  6. I then dragged and resized the UFO to the upper corner of the frame.
Repositioning and resizing the UFO.
  7. In the pane, I changed the quick animation to Hover and reduced the volume to 0. I also dragged the timeline span to cover the whole duration of the video.
  8. I wanted to add one more effect, so I clicked on the Effects tab and added Plasma sparks, then moved it to the upper right and reduced the volume to 0. I also changed the timespan to cover the whole video duration.
Adding the plasma sparks.
  9. When I was done, I clicked the Done button and then the Finish video button, selecting the High 1080p video quality and pressing Export.
  10. I saved the file to my computer as Video Frame.mp4, resulting in this video.
The final video that I will use.
  11. Next, in OBS Studio, I added a new Media Source called Video Frame, selecting my Video Frame.mp4 as the source file and checking the Loop button.
Adding the video as a media source in OBS Studio
  12. I added a Chroma Key filter to the Video Frame media source.
Adding the Chroma Key filter – you may need to adjust the settings to have the best effect for your animations.

And now I have an animated saucer and plasma sparks in my camera feed once I press the Start Virtual Camera button in OBS Studio. The camera then shows up in my list of available cameras as OBS Virtual Camera. I keep the frame image right behind the video frame so I can turn off the animations if they get too distracting.

In OBS Studio with all of my sources ready to run a Virtual Camera
Resulting video composite.

The way I look at it, if all people see of me is a rectangle of pixels the size of a credit card, I want all of those pixels to count. Please share what you create to make your web meetings more fun.

Using a reMarkable Tablet in Web Meetings

Scott Hanselman posted a video earlier this month that gave me an idea. He showed how he used OBS Studio and Microsoft Whiteboard to create a transparent glass whiteboard in Microsoft Teams, and I saw in it an interesting way to use my reMarkable tablet to do something similar.

reMarkable Tablet

Working from home, like many of you, my primary device for work is a desktop computer, which does not have the drawable surface and pen of a Microsoft Surface device. Recently, I got the new reMarkable 2 paper-like tablet and I really like it. The device works with a dedicated desktop app that, in addition to synchronizing notebooks, has a Live View capability where the display on the computer stays in sync with the tablet, updating every time the page on the tablet is drawn on. I was able to take the live output from the reMarkable app on my PC as an input source in OBS Studio and replicate the transparent glass whiteboard effect to use during Microsoft Teams meetings. If you aren’t familiar with it, OBS Studio is a free, open-source application for Windows, macOS, and Linux that you can use to stream and record from your computer, mixing video, desktop windows, audio sources, and graphics. Here is how I did it:

  1. On my reMarkable 2 tablet, I added a new page to a notebook, using the blank page template, and set the orientation to landscape.
  2. On my PC, I started the reMarkable app.
  3. On my reMarkable tablet, I turned the LiveView (Beta) option on in the share menu.
  4. Once I did that, I was prompted on the app on my PC to accept the LiveView request. At that point, the app’s screen mirrored my tablet.
  5. In OBS Studio, I added a Video Capture Device for my webcam and stretched it to the size of the screen output.
  6. I then added a Window Capture source, selecting the reMarkable app as the Window.
  7. Now the whiteboard is positioned over the video capture device. Before I resize it, I need to crop the edges.
  8. I drag the edges of the new Window Capture element with the Alt key pressed to crop out the frame and chrome around the whiteboard.
  9. Now I resize the whiteboard so it covers my Video Capture Device.
  10. The last thing I do is add filters to the Window Capture to make it all work. Select the Window Capture source, right-click, and select Filters…
  11. Add a Color Correction filter to make the white background green.
  12. Add a Chroma Key filter to remove the background. You may need to adjust the Similarity value if you use gray pens on the reMarkable.
  13. Add another Color Correction filter to make the black text white.
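The Chroma Key filter itself is just an OBS setting, but if you’re curious what the Similarity value is doing under the hood, here is a rough sketch of the idea (illustrative only – not OBS’s actual implementation, and the 0–255 scale is an assumption):

```python
import numpy as np

def chroma_key_mask(rgb: np.ndarray, key=(0, 255, 0),
                    similarity: float = 120.0) -> np.ndarray:
    """Alpha mask: 0 (transparent) where a pixel is within `similarity`
    of the key color in RGB space, 255 (opaque) everywhere else."""
    dist = np.linalg.norm(rgb.astype(float) - np.asarray(key, dtype=float),
                          axis=-1)
    return np.where(dist < similarity, 0, 255).astype(np.uint8)
```

Raising the similarity threshold makes more near-green (or, after the Color Correction filter, near-background) pixels transparent, which is why gray pen strokes may need the value adjusted.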

And then you can stream, record, or use OBS as a Virtual Camera in your online meetings. The whiteboard drawing from your tablet is saved, so you can easily export it to a PDF or image, and it appears in the video as well.

A transparent whiteboard drawing with a reMarkable tablet.

Give it a try today and add more drawing to your online meetings!

Limitations

Some users have pointed out a limitation to the LiveView (Beta) feature in the reMarkable app where erasing on the tablet does not immediately erase on the LiveView. A quick fix to refresh the LiveView is to tap the Full-screen button in the lower right corner of the app to make the app go full screen, and then tap it again to go back to the original size. This triggers a refresh of the LiveView with the erased ink. I reported this bug to the reMarkable team.

TechCreativeCoaching.com

For the past few years I have been doing coaching and mentoring and have found helping others greatly rewarding. Most people were finding me through my work at Microsoft and LinkedIn, but I thought it was time to launch a website focused on it. My specialty is helping people combine their creative passions with their love of technology. Take a look today at TechCreativeCoaching.com.

Take a look at what my clients have to say, the reading list, and the video list. I’d love your feedback, and if you want to book an appointment to discuss your career, please fill in the form on the home page and I’ll get back to you.

Virtual Flight Sketching

The new Microsoft Flight Simulator has opened a new location for me to take my sketching: anywhere in the world. One of the first games I played on my first computer, an IBM PCjr, was Microsoft Flight Simulator in the 1980s, and it started my journey into computing with a fascination for three-dimensional environments represented on a flat screen. The technology has advanced amazingly since then, and so have my drawing skills.

San Francisco Bay from my Aviat Pitts Special S2S drawn with Adobe Fresco

The imagery and geometry now in Microsoft Flight Simulator is very accurate, lifelike, and, for me as an urban sketcher, good enough to sketch. The application gives me the foreground (an airplane cockpit), the midground (buildings and geology), and the background (scenic vistas with accurate weather rendering).

Sydney Opera House from my Icon A5 drawn in my journal

I pick a location and an airplane, fly to get just the right point of view, and then press [Pause]. I then start sketching in my journal from Iona Handcrafted Books, in Adobe Fresco, or even in my Sketch 360 app. Since these sketches aren’t from real life, I shouldn’t call them Urban Sketches, so I’ve decided to call them Virtual Flight Sketches with the hashtag #VirtualFlightSketch.

Manhattan at Sunset from my JMP VL-3

I’ve always wanted to see the pyramids of Egypt.

Giza Pyramids and Sphinx from my JMP VL-3

I created my latest Virtual Flight Sketch with my Sketch 360 app and exported it as an animation video that you can interact with.

San Francisco from EX ZLin Savage Cub

For this sketch, I had Flight Simulator on the left screen and Sketch 360 running on the Wacom One display tablet as the drawing canvas, with the 360 view shown on the right display.

360 Sketching studio setup.

The funny thing about pausing in Flight Simulator is that the plane stops in mid-flight but the clock does not stop. This means that if I’m doing a sketch at sunset, the lighting is going to change over the course of my sketch. It adds a realistic aspect to the experience. I know that I could easily take a screenshot and work from that, but I choose not to.

Chicago at Sunset in Icon A5 Virtual Flight Sketch

Where should I fly for my next #VirtualFlightSketch ?

Analog + Digital 360 Sketching Workshop

Workshop led by Michael S. Scherotter, Creative Experience Engineer at Microsoft
Codame Art+Tech Festival
GitHub, San Francisco
November 26, 2019 2-5 PM

Come learn a new fun way of sketching to create immersive panoramic drawings from a single point of view looking in all directions. This workshop will cover both drawing by hand on paper aided by an equirectangular projection grid as well as with digital tools. See examples here of 360 Sketches Created by the Instructor, Michael Scherotter.

  1. Introduction to Equirectangular Projections
  2. Exercise: Drawing on Equirectangular Grid Paper
  3. Photograph drawings, crop, and load into tablets & VR Headsets (like Oculus Go)
  4. Demonstrate 360 Drawing with Sketch 360
  5. Exercise options
    1. Make 360 sketches with Sketch 360
    2. Make 360 sketches with any drawing app using Equirectangular Grid underlay
    3. Continue Paper 360 drawing
  6. View results in tablets & VR Headsets

Resources

Equirectangular Projection Grid

Sign up Today: there’s still space!


Analog + Digital 360 Sketching Workshop at CODAME Festival and Creative Tool Chains

On October 26, 2019, I will be leading a workshop on analog+digital 360 sketching at the CODAME Art+Tech Festival in San Francisco, CA. The festival is a 3-day conference about the intersection of art and technology, with many mind-expanding workshops and exhibitions helping people see how artists, musicians, technologists, and researchers are fusing technology with artistry and creativity.

Ever since studying architecture in college, I’ve thrust myself into this intersection, focusing my passion on creative tools both analog (journals) and digital (computer tablets), and using the digital to make the analog (3D printing). People have often asked me how I draw the line: will I ever give up my journaling? Which do you like better: the analog fountain pen or the digital pen? It’s not simple, but recently I’ve clarified it by asserting that for digital mediums, I focus on things I can’t do easily with my analog tools of pens, pencils, watercolors, knives, and glue sticks.

Recently I’ve been focusing on a specific type of medium, for which I created the app Sketch 360, and that’s the domain of the workshop on Saturday at the CODAME festival. I’m going to start the workshop with pens and paper, helping participants understand the equirectangular projection that 360 sketching is based on, and then move to digital tools.
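The projection itself is simple to state: horizontal position on the page maps linearly to azimuth (the direction you are looking around the horizon), and vertical position maps linearly to elevation. A tiny sketch of that mapping (the function name and canvas size are just illustrative, not from Sketch 360):

```python
def equirect_xy(azimuth_deg: float, elevation_deg: float,
                width: int = 2048, height: int = 1024):
    """Map a viewing direction to pixel coordinates on an equirectangular
    canvas: azimuth -180..180 runs left to right, elevation +90 (straight
    up) is the top edge and -90 (straight down) is the bottom."""
    x = (azimuth_deg + 180.0) / 360.0 * width
    y = (90.0 - elevation_deg) / 180.0 * height
    return x, y
```

This is why a printed equirectangular grid works as an underlay for hand drawing: straight vertical edges in the scene stay vertical, while horizontal edges curve along the grid lines.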

Sketch 360

Creative Tool Chains

The aspect of creativity that has always interested me is the process people take in their creative endeavors and the tools they use in that process. After my architecture degree (BArch), I got a master’s degree (MArch) in design tool development and have been passionate ever since about crafting tools to help people be more creative. I have been experimenting recently with creating chains of creative tools like this:

  1. Using OpenSCAD (a creative tool that lets you write code to make 3D models that can be printed on a 3D printer) to design a parametric watercolor kit.
  2. Hosting that model on Thingiverse, where people can customize it (a creative tool) and output a model to their exact specs that they can print
  3. Using the watercolor kit that I printed (a creative tool), to paint a watercolor
Watercolor kit

In this way, I look at a watercolor brush and code both as creative tools. Sometimes the chain has parallel links:

  1. Creating a stand for my GoPro camera that allows me to record video while creating artwork
  2. Creating a drawing in an airport with pens and a journal
  3. Creating a song with a Teenage Engineering sequencer while creating the drawing
  4. Creating a video of the process.

I’m looking forward to going to the CODAME Art-Tech festival because I want to meet others who look at creativity and technology like I do: people who want to mix it up, bounce ideas, experiment, try new things. DM me at @Synergist if you want a discount code for the festival.

See you there – create with me!