AR Filters
Tags :
AR lenses, AR filters, Shaders, Spark AR, Effector
Role :
Creator
2020 07

Project ‘Breakdown’
. this is a collection of AR filters made with Spark AR and TikTok Effector, using the Patch Editor and JavaScript with the Reactive programming module.
Filters demo
Filter 01 : Glitch
. the following effect shifts and rotates the RGB channels of the camera texture; it was produced entirely in the Patch Editor by modifying the native RGB shifter patch that Spark provides
. the main change is in the “2D transform pack”, where a continuous 360° rotation is added to 2 of the 3 channels (a script sketch of the same idea follows below)
. the filter is named “alterEgo”

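The glitch filter itself is built entirely in the Patch Editor, so there is no project script to share. The sketch below only approximates the same channel-separation idea in script form: it assumes a material named 'glitchMaterial' and the default camera texture asset 'cameraTexture0' (both hypothetical names), and it animates a simple horizontal offset per channel rather than the continuous rotation used in the patch graph.

```js
// Illustrative sketch only (not the project's Patch Editor graph).
// Assumptions: a flat material named 'glitchMaterial' whose diffuse texture is
// overridden, and the default camera texture asset 'cameraTexture0'.
const Materials = require('Materials');
const Textures = require('Textures');
const Reactive = require('Reactive');
const Shaders = require('Shaders');
const Time = require('Time');

(async function () {
  const [material, cameraTexture] = await Promise.all([
    Materials.findFirst('glitchMaterial'),
    Textures.findFirst('cameraTexture0'),
  ]);

  // interpolated texture coordinates of the fragment being shaded
  const uv = Shaders.vertexAttribute({variableName: Shaders.VertexAttribute.TEX_COORDS});

  // small animated offset driving the channel separation
  const t = Reactive.mul(Time.ms, 0.001);
  const wobble = Reactive.mul(Reactive.sin(t), 0.01);

  // sample the camera texture three times with slightly different UVs
  const sampleR = Shaders.textureSampler(cameraTexture.signal, Reactive.add(uv, Reactive.pack2(wobble, 0)));
  const sampleG = Shaders.textureSampler(cameraTexture.signal, uv);
  const sampleB = Shaders.textureSampler(cameraTexture.signal, Reactive.sub(uv, Reactive.pack2(wobble, 0)));

  // keep one channel from each sample and recombine them
  const shifted = Reactive.add(
    Reactive.add(
      Reactive.mul(sampleR, Reactive.pack4(1, 0, 0, 0)),
      Reactive.mul(sampleG, Reactive.pack4(0, 1, 0, 0))),
    Reactive.mul(sampleB, Reactive.pack4(0, 0, 1, 1)));

  material.setTexture(shifted, {textureSlotName: Shaders.DefaultMaterialTextures.DIFFUSE});
})();
```
Replacing the fixed horizontal offset with a rotation of the UVs around the texture centre would bring the sketch closer to the actual patch graph.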
Shared Code Sample :
Filter 02 : Audio
. the following effect takes advantage of the Reactive module, interfacing with values defined in the Patch Editor
. the commented .js file included in the project, which targets the interactive background material, is shown below
. the script mainly does the following
1. creates a reactive texture listening to an audio magnitude value coming from the device microphone
2. assigns a static gradient to the interactive pattern created via SDF
3. assigns a pulsing gradient to the background
. a second script assigns the same static gradient of the interactive background to the segmentation material targeting the person (a sketch of it follows the shared code sample)
. the filter is named “Artsy Beat”
//==============================================================================
// References
// https://fb.me/timemodule
// https://fb.me/audiomodule#audio-playback-controller
// https://fb.me/tiled-sdf-patch
// https://fb.me/bridging
// Project setup:
// 2D-3D audio visualizer that retrieves scalar signal values from the patch editor
// and with a Signed Distance Field creates a texture that modifies the scale of the
// pattern based on the audio input
// there is a pulsing color as background image
// there is a gradient color on the generated SDF texture and on the user profile
// remove the comments if you use it for a personal project.
// this script handles the audio shader
//==============================================================================
// REQUIRED MODULES
const Materials = require('Materials');
const Reactive = require('Reactive');
const Shaders = require('Shaders');
const Time = require('Time');
const Patches = require('Patches');
const Scene = require('Scene');

//==============================================================================
// assignments
// get the audio value from the patch editor
let myAudioValue = Patches.getScalarValue('AudioValue');

(async function() {
  // find the targeted material
  var [material] = await Promise.all([
    Materials.findFirst('material0'),
  ]);

  //============================================================================
  // PARAMETERS
  // SDF shape starting parameters
  const center = Reactive.pack2(0.05, 0.05);

  // set interactive radius
  let radius = 0.05;
  // radius = myAudioValue;

  // adjust the radius with the audio value
  var radius2 = Reactive.mul(myAudioValue, 0.07);

  // create an SDF shape with the previous center and radius
  const halfSize = Reactive.pack2(radius2, radius2);
  var sdfRect = Shaders.sdfRectangle(center, halfSize, {sdfVariant: Shaders.SHARP});

  // COLOR
  // assign color to the shape with mix sdf
  const sdfMix = Shaders.sdfMix(sdfRect, 0, 0.993);

  // set the repeating parameters
  const pivot = Reactive.pack2(0.05, 0.05);
  const size = Reactive.pack2(0.09, 0.09);

  // repeat the shape using repeat sdf
  const sdfRepeat = Shaders.sdfRepeat(sdfMix, pivot, size);

  // create a step parameter
  const step = Reactive.step(sdfRepeat, 0);

  // create the gradient
  const gradient = Shaders.gradient({"type": Shaders.GradientType.VERTICAL});

  // create two colors
  var color1 = Reactive.pack4(1, 0.57, 0, 1);
  var color2 = Reactive.pack4(1, 0.25, 1, 1);

  // create a mix of the two colors using the gradient
  const mix1 = Reactive.mix(color1, color2, gradient);

  // create a third color
  const color3 = Reactive.pack4(0.5, 0, 3, 1);

  // set time parameters
  // PULSE
  const ct = Reactive.mul(Time.ms, 0.001);
  const curve = Reactive.abs(Reactive.sin(ct));

  // modulation of the color based on time (r, g, b, a)
  const modulationColor = Reactive.pack4(0, 0, curve, 1);
  const finalColor = Reactive.mul(color3, modulationColor);

  // create the second mix
  var mix2 = Reactive.mix(mix1, finalColor, step);

  // ASSIGN
  //============================================================================
  const textureSlot = Shaders.DefaultMaterialTextures.DIFFUSE;
  material.setTexture(mix2, {textureSlotName: textureSlot});
})();
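The second script mentioned in the list above, the one that applies the same static gradient to the person-segmentation material, is not part of the listing. A minimal sketch of what it could look like, assuming a segmentation material named 'personMaterial' (hypothetical name), reuses the gradient and colors from the background shader:

```js
// minimal sketch, assuming a person-segmentation material named 'personMaterial'
const Materials = require('Materials');
const Reactive = require('Reactive');
const Shaders = require('Shaders');

(async function () {
  const material = await Materials.findFirst('personMaterial');

  // same vertical gradient and colors used for the interactive background
  const gradient = Shaders.gradient({"type": Shaders.GradientType.VERTICAL});
  const color1 = Reactive.pack4(1, 0.57, 0, 1);
  const color2 = Reactive.pack4(1, 0.25, 1, 1);
  const staticGradient = Reactive.mix(color1, color2, gradient);

  // assign it to the diffuse slot of the segmentation material
  material.setTexture(staticGradient, {textureSlotName: Shaders.DefaultMaterialTextures.DIFFUSE});
})();
```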
Filter 02 : PatchEditor
. * note that the filter can only be tested on device, since the sound input is the device microphone

Filter 03 : Question
the following effect includes
. optimization and animation of 3D characters from third-party software
. production of 2D graphics for the billboard particles
. a multiple-choice random answer
. a touch gesture to switch the character animations, combined with an audio playback controller for playing a soundtrack (a sketch of this interaction follows below)
. the filter is named “Very Next Date”

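The animation switching itself is wired in the Patch Editor, but the tap-and-soundtrack part of the interaction can be sketched in script. This is an illustration only, assuming an audio playback controller asset named 'audioPlaybackController0' (hypothetical name) and the Touch Gestures capability enabled in the project:

```js
// illustrative sketch of the tap-driven soundtrack toggle
const TouchGestures = require('TouchGestures');
const Audio = require('Audio');
const Diagnostics = require('Diagnostics');

(async function () {
  // 'audioPlaybackController0' is an assumed asset name
  const soundtrack = await Audio.getAudioPlaybackController('audioPlaybackController0');
  soundtrack.setLooping(true);

  let playing = false;

  // each tap toggles the soundtrack; the same tap event could also send a
  // pulse to the Patch Editor to switch the character animation
  TouchGestures.onTap().subscribe(() => {
    playing = !playing;
    soundtrack.setPlaying(playing);
    Diagnostics.log('soundtrack playing: ' + playing);
  });
})();
```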
Shared Code Sample :
Filter 03 : Question
. referencing the patch graph, the script below updates the text object with the randomly generated date
// Set random date and update Text Object
//==============================================================================
// REQUIRED MODULES
const Scene = require('Scene');
const Patch = require('Patches');
const Diagnostics = require('Diagnostics');

//==============================================================================
// ASSIGNMENTS
var textToUpdate = Scene.root.child('Device').child('Camera').child('Focal Distance').child('faceTracker0').child('baseCanvas').find('textUpdate');
var updatePulse = Patch.getPulseValue('updatePulse');

var day, month, year;

// GET VALUES FROM PATCH
day = Patch.getScalarValue('day');
month = Patch.getScalarValue('month');
year = Patch.getScalarValue('year');

var finalDay, finalMonth, finalYear;
finalDay = 0;
finalMonth = 0;
finalYear = 0;

var textFinal = "";

//==============================================================================
// SUBSCRIBE TO THE SINGLE VALUES
day.monitor().subscribe(val => {
  finalDay = val.newValue;
});

month.monitor().subscribe(val => {
  finalMonth = val.newValue;
});

year.monitor().subscribe(val => {
  finalYear = val.newValue;
});

//==============================================================================
// UPDATE
updatePulse.subscribe(function (e) {
  textFinal = finalDay + " " + finalMonth + " " + finalYear;
  textToUpdate.text = " " + textFinal;
});
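In the project the random day, month and year values come from the Patch Editor graph; purely as an alternative sketch, the date could also be generated directly in script and written to the text object on each pulse:

```js
// alternative sketch only: generate the random date in script instead of in the patch graph
function randomDate() {
  const day = 1 + Math.floor(Math.random() * 28);    // 1-28 stays valid for every month
  const month = 1 + Math.floor(Math.random() * 12);  // 1-12
  const year = 2021 + Math.floor(Math.random() * 10);
  return day + " " + month + " " + year;
}

// usage inside the pulse subscription shown above:
// updatePulse.subscribe(() => { textToUpdate.text = " " + randomDate(); });
```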
TikTok Filters
* Because of US website maintenance these filters are not online yet.
Previous related work
In this AR installation from 2017 I focused on a workflow that included
. photogrammetry of 3D objects
. MeshLab optimization of the scanned objects
. creative assembly of the objects in Maya
. rendering
. ZBrush optimization of the mesh objects
. Substance Painter for texturing
. Unity for visualizing the assets in AR
see the full design portfolio featuring 3D modeling work, robotics, photogrammetry, AR and more.
