I used StyleGAN2-ADA to generate the original portrait, and then applied some special effects in Unreal.

Two videos:

For my previous AI exercises I used StyleGAN or StyleGAN2, which required large datasets and long training times.

This batch switched to StyleGAN2-ADA, which Nvidia released later.
It supports small datasets, which is a real blessing for amateur players.

As for the framework, I had used Google's TensorFlow before, but switched to Facebook's PyTorch this time.
There is no special reason; seeing PyTorch used more and more, I just wanted to try it.

Ref:

StyleGAN2-ADA:…


This article introduces how to bring effects from Shadertoy into Spark AR Studio.

It has been several years since Spark AR Studio launched, and its features are fairly mature.


Steps:

  • Prepare a handsome photo.
  • Calculate the depth value of the scene in the photo through machine learning, and get a depth texture.
  • Write shader code to implement a green scan-line effect.

Calculate Photo Scene Depth

Why do we need to calculate the photo's scene depth?

Because the photo is two-dimensional: if you directly use the color or grayscale of the 2D image to simulate a 3D effect, the result often does not match the real-world scene:
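The scan-line step can be sketched as a function of depth: the line sweeps through depth values over time, and pixels whose depth is near the current sweep position light up green. This Python version is only illustrative (the real effect lives in Spark AR shader code); the Gaussian falloff and its width are my own arbitrary choices.

```python
import math

def scanline_green(depth, sweep, width=0.05):
    """Green intensity for a pixel, given its scene depth.

    depth: this pixel's depth (0..1), sampled from the depth texture
    sweep: the scan line's current depth position (animated over time)
    width: how thick the glowing band is (arbitrary choice here)
    """
    # Gaussian falloff: full brightness at the sweep depth, fading out quickly
    return math.exp(-((depth - sweep) ** 2) / (2 * width ** 2))
```

In a shader, the same expression would run per pixel, with `depth` read from the machine-learned depth texture and `sweep` driven by a time uniform.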


A year ago today, I made a Christmas tree using Blender and Python:

This time I tested another one:

Export the model (mesh) generated by the Sverchok plug-in or a Python script from Blender, then import it into a web page for rendering.
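Blender can export such meshes directly (OBJ, glTF, and so on), but it is worth seeing how little data a mesh actually needs for web rendering. Here is a minimal hand-rolled Wavefront OBJ writer — a hypothetical helper for illustration, not the project's actual exporter:

```python
def write_obj(path, vertices, faces):
    """Write a minimal Wavefront OBJ file.

    vertices: list of (x, y, z) tuples
    faces: list of vertex-index tuples (0-based here; OBJ itself is 1-based)
    """
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")          # one vertex per line
        for face in faces:
            # OBJ face indices start at 1, so shift each index by one
            f.write("f " + " ".join(str(i + 1) for i in face) + "\n")

# A single triangle, loadable by e.g. three.js's OBJLoader:
write_obj("tri.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
```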


When a Mac is connected to a high-resolution 2K monitor, the text is actually a bit small. For example, the DELL U2518D in my hands has a native resolution of 2560×1440 stuffed into a 25-inch screen.


This project, a robotic arm with multiple screens (by ManaVR ✖ INT++), was made in 2017.
In the early stage, I used MaxMSP Jitter with ABB's RobotStudio to simulate the robotic arm and the large screen.

This article focuses only on using MaxMSP to simulate the project prototype, making full use of the very convenient TCP communication, multi-screen motion simulation, and other modules in MaxMSP Jitter.
In short, you know, in my hands MaxMSP is not just MaxMSP 🙃.
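MaxMSP's TCP modules ultimately speak plain sockets, so the kind of command/reply exchange used in such a prototype can be sketched in a few lines. This is a hypothetical echo setup for illustration, not the project's actual protocol:

```python
import socket
import threading

def open_echo_server():
    """Start a one-shot echo server on an OS-chosen port; return (port, thread)."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))   # port 0 lets the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        conn.sendall(conn.recv(1024))   # echo the "robot command" back
        conn.close()
        srv.close()

    t = threading.Thread(target=serve)
    t.start()
    return port, t

def send_command(port, msg):
    """Send a text command (e.g. a made-up 'move' message) and wait for the reply."""
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(msg.encode())
        return cli.recv(1024).decode()
```

In the real setup, the Python side would be replaced by MaxMSP's networking objects talking to the robot controller.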


The key steps of the video above:

  • Use TensorFlow's PoseNet, a web-based machine-learning model, for motion capture;
  • Link the PoseNet page to MaxMSP with the Node for Max module provided by MaxMSP;
  • The human motion data captured by PoseNet is sent back to MaxMSP through Socket.IO;
  • MaxMSP sends the received data to Blender via OSC;
  • Blender uses the received data to control the deformation animation in real time.
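The OSC hop in the pipeline above is just UDP packets in a simple binary layout. As an illustration of what goes over the wire, here is a minimal pure-Python encoder for a float-only OSC message — in practice Max's udpsend or a library such as python-osc handles this, and the address pattern below is made up:

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are NUL-terminated and padded to a multiple of 4 bytes."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message carrying big-endian float32 arguments."""
    type_tags = "," + "f" * len(args)          # e.g. ",ff" for two floats
    data = osc_string(address) + osc_string(type_tags)
    for a in args:
        data += struct.pack(">f", a)           # big-endian float32
    return data

# Hypothetical pose data: send normalized nose coordinates to Blender
packet = osc_message("/pose/nose", 0.42, 0.58)
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))
```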

Mostly about three topics:

  • Demonstrate PoseNet motion capture.
  • Data transfer between MaxMSP’s Node for Max and web pages (in fact, several previous examples are about this with video tutorials).

These days I’m learning machine learning and trying to generate visuals based on StyleGAN, a neural network.

AILog.005 was first sent to my wife; she said: “OK, more suppressed.”

“World is ahead” AILog.005

Great! This is the feeling I’m looking for. It is not just that I specifically mean “suppression”; it is that I finally have a way of expressing myself, even if it is “obscure”.


Blender is now a new force in 3D art. New, but not young: it is about twenty years old.


The first article in 2020, accidentally picked an Old School topic.

There was a scene in 名探偵コナン 戦慄の楽譜フルスコア (Detective Conan: Full Score of Fear), released about ten years ago. Conan was standing in the middle of the water. First he knocked the phone receiver off its hook on the shore with a spectacular long shot, then closed his eyes and shouted loudly, remotely dialing 110, the police emergency number, with his voice.

This time I will talk about how to use sound to make a phone call, plus some more advanced content: using sound waves as a carrier for interaction, and decoding DTMF signals.
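DTMF encodes each keypad key as the sum of one low (row) tone and one high (column) tone, and a decoder simply looks for the strongest row/column pair. A minimal sketch using the standard DTMF frequency table and the Goertzel algorithm — the sample rate and tone duration are arbitrary choices here:

```python
import math

# Standard DTMF keypad frequencies (Hz): each key = one row tone + one column tone
ROWS = [697, 770, 852, 941]
COLS = [1209, 1336, 1477, 1633]
KEYS = ["123A", "456B", "789C", "*0#D"]
RATE = 8000  # sample rate in Hz (arbitrary choice for this sketch)

def dtmf_tone(key, duration=0.1):
    """Generate the dual-tone samples for one keypad key."""
    r = next(i for i, row in enumerate(KEYS) if key in row)
    c = KEYS[r].index(key)
    f1, f2 = ROWS[r], COLS[c]
    n = int(RATE * duration)
    return [math.sin(2 * math.pi * f1 * t / RATE) +
            math.sin(2 * math.pi * f2 * t / RATE) for t in range(n)]

def goertzel(samples, freq):
    """Power of a single frequency in the samples, via the Goertzel algorithm."""
    coeff = 2 * math.cos(2 * math.pi * freq / RATE)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def decode_key(samples):
    """Recover the key by picking the strongest row and column tone."""
    r = max(range(4), key=lambda i: goertzel(samples, ROWS[i]))
    c = max(range(4), key=lambda i: goertzel(samples, COLS[i]))
    return KEYS[r][c]
```

Played through a speaker into a phone's microphone, such tone pairs are exactly what the exchange listens for, which is why a voice that hits the right frequencies can dial a number.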

Conan not only demonstrated the final effect…

AvantContra

Technical artist, computational art, generative art, interactive media, ex game/App/web developer. floatbug.com
