new xglsl()
Methods
(static) cubeVoxelGeom(vparas) → {THREE.BufferGeometry}
Get a geometry buffer for cube voxels, with attributes the shader can work with;
each vertex has a random color, noise and size.
The attribute names are color, size (Three.js defaults) and a_noise.
Parameters:

Name | Type | Description
---|---|---
vparas | object | u_sects: [w, h, d] segments in 3D

Returns:
the geometry buffer.
- Type: THREE.BufferGeometry
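A minimal usage sketch (the import paths and the u_sects value are assumptions, not part of this doc):

```js
import * as THREE from 'three';
import { xglsl } from 'x-visual/xglsl';   // hypothetical import path

// Build a 4 x 4 x 4 voxel geometry; the buffer carries per-vertex
// color, size and a_noise attributes as described above.
const geom = xglsl.cubeVoxelGeom({ u_sects: [4, 4, 4] });

// Rendered here with a stock PointsMaterial only to keep the sketch
// self-contained; the attributes are really meant for this library's
// own shaders (e.g. cubeVoxels / randomRarticl).
const voxels = new THREE.Points(
  geom,
  new THREE.PointsMaterial({ size: 2, vertexColors: true })
);
```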
(static) cubeVoxels(paras) → {object}
Create vertex & fragment shaders that can morph between multiple box positions.
Parameters:

Name | Type | Description
---|---|---
paras | object | paras.uniforms.u_cubes: array of boxes; paras.uniforms.u_box1, 2, ...: vec3 position index (voxel index, not position); paras.uniforms.u_morph: morph animation, 0 - 1

Returns:
{vertexShader, fragmentShader}
- Type: object
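A sketch of wiring the returned shaders into a THREE.ShaderMaterial, with the same imports as the sketch above (the uniform values are illustrative):

```js
const shaders = xglsl.cubeVoxels({
  uniforms: {
    u_cubes: [],                          // array of boxes (placeholder)
    u_box1: new THREE.Vector3(0, 0, 0),   // voxel index of box 1
    u_box2: new THREE.Vector3(2, 3, 1),   // voxel index of box 2
    u_morph: 0                            // morph animation, 0 - 1
  }
});

const mat = new THREE.ShaderMaterial({
  uniforms: { u_morph: { value: 0 } },    // drive this per frame, e.g. (t * 0.001) % 1
  vertexShader: shaders.vertexShader,
  fragmentShader: shaders.fragmentShader
});
```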
(static) flameLight(vparas)
Example: See docs/design memoe/shader samples
Parameters:

Name | Type | Description
---|---|---
vparas | object | visual paras, same as Visual.paras
(static) flameLight(paras) → {object}
Example: See docs/design memoe/shader samples
Parameters:

Name | Type | Description
---|---|---
paras | object | paras.vert_scale: point scale

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) fragShape(paras) → {object}
Reference: https://thebookofshaders.com/05/ (shaping functions).
Parameters:

Name | Type | Description
---|---|---
paras | object | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) hasUAlpha(shader) → {bool}
Does the shader support u_alpha? Shaders that do include AssetType.point, refPoint, GeomCurve and colorArray.
Parameters:

Name | Type | Description
---|---|---
shader | ShaderFlag |

Returns:
- Type: bool
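A small hedged sketch (the surrounding material and shader-flag variables are assumptions):

```js
// Only attach a u_alpha uniform when the shader flag supports it.
if (xglsl.hasUAlpha(shaderFlag)) {
  mat.uniforms.u_alpha = { value: 0.75 };
}
```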
(static) initPhongUni(uniforms, light, paras) → {object}
A common function for initializing phong uniforms. Supposed to be changed in the future - phong uniforms should not be common across different shaders?
Don't use this directly. It is only a shortcut for certain shaders.
Parameters:

Name | Type | Attributes | Description
---|---|---|---
uniforms | object | <optional> | if undefined, one will be created
light | object | |
paras | paras | <optional> | usually Visual.paras

Returns:
uniforms
- Type: object
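A hedged sketch of the call pattern (light, visual.paras and the shader strings are assumed prepared elsewhere):

```js
// Let the helper create the uniform set (uniforms is optional and
// created when undefined, as documented above).
const uniforms = xglsl.initPhongUni(undefined, light, visual.paras);
const mat = new THREE.ShaderMaterial({ uniforms, vertexShader, fragmentShader });
```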
(static) orbGroups(vparas) → {object}
Get the shader of a sequence of orb groups, each for a wpos - a vec3 array - with respect to the vertex dir & cent attributes.
Test page: test/html/map3d/geopaths.html
ShaderFlag: worldOrbs
Parameters:

Name | Type | Description
---|---|---
vparas | object | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) particlesGeom(vparas, meshSrc, meshTarget) → {THREE.BufferGeometry}
Parameters:

Name | Type | Description
---|---|---
vparas | Visual.paras |
meshSrc | THREE.Mesh |
meshTarget | THREE.Mesh |

- Deprecated: this function can be completely covered by cubeVoxelGeom(). Creates a geometry buffer from the target mesh; if the shader type is randomParticles, the buffer also has attributes color and size.

Returns:
- Type: THREE.BufferGeometry
(static) phongMorph2(vparas, paras) → {object}
Get a shader that can be morphed with colors and textures.
Test page: test/html/morph-color.html
ShaderFlag: colorArray
See test page for different mixing mode results.
Reference implementation @ cs.toronto.edu.
Parameters:

Name | Type | Description
---|---|---
vparas | object | vparas.texMix - one of ShaderAlpha.multiply, additive, mix
paras | object |

Returns:
{vertexShader, fragmentShader}
- Type: object
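A hedged sketch of selecting a mixing mode (the ShaderAlpha import and the content of the second argument are assumptions based on the tables above):

```js
// Request the colorArray shader with multiplicative texture mixing.
const { vertexShader, fragmentShader } = xglsl.phongMorph2(
  { texMix: ShaderAlpha.multiply },   // vparas: mixing mode
  visual.paras                        // paras: usually Visual.paras (assumed)
);
```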
(static) pointGeom(point) → {THREE.Geometry}
Get a points geometry buffer for simulating a flowing path.
Parameters:

Name | Type | Description
---|---|---
point | THREE.Vector3 |

Returns:
point geometry
- Type: THREE.Geometry
(static) randomRarticl(flag, vparas) → {object}
Parameters:

Name | Type | Description
---|---|---
flag | int | @see ShaderFlag
vparas | Visual.paras | @see Visual

Returns:
{vertexShader, fragmentShader}
The shaders for THREE.ShaderMaterial (using variables supported by Three.js), where
return.vertexShader {string}
return.fragmentShader {string}
- Type: object
(static) randomRarticl(paras) → {object}
Get the shader for ShaderFlag.randomParticles. If u_morph is animated, a uniform vec3 & a_target must be provided. Used variables: position, color, size. gl_Position = mix(pos, target, morph) + noise * dist;
Parameters:

Name | Type | Description
---|---|---
paras | object | paras.u_dist {float} in world; paras.u_morph {float}; paras.a_dest {vec3} in world; paras.a_noise {float}; paras.size_scale {float}

Returns:
{vertexShader, fragmentShader}
- Type: object
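A sketch of driving the morph from the render loop (uniform names follow the paras table above; the concrete values are illustrative):

```js
const shaders = xglsl.randomRarticl({
  u_dist: 50,       // {float} in world
  u_morph: 0,       // {float}, animated 0 - 1
  size_scale: 2     // {float}
});

const mat = new THREE.ShaderMaterial({
  uniforms: { u_morph: { value: 0 }, u_dist: { value: 50 } },
  vertexShader: shaders.vertexShader,
  fragmentShader: shaders.fragmentShader
});

// per frame:
// mat.uniforms.u_morph.value = (performance.now() * 0.001) % 1;
```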
(static) scaleOrb(vparas) → {object}
Get the shader of a scaled orb, with respect to the vertex dir & cent attributes.
Parameters:

Name | Type | Description
---|---|---
vparas | object | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) testPoint(paras) → {object}
Get the shader of gl_point for debugging.
Parameters:

Name | Type | Attributes | Description
---|---|---|---
paras | object | <optional> | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) testPoint(paras) → {object}
Get the shader of a reflector.
Parameters:

Name | Type | Attributes | Description
---|---|---|---
paras | object | <optional> | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) texPrism(vparas)
Render a prism extruded from an xz polygon, with texture on the roof and lateral faces.
Parameters:

Name | Type | Attributes | Description
---|---|---|---
vparas | object | <optional> | Visual.paras

See:
- glx.boxLayers, an attempt to shade a building floor without the floor texture.
(static) thermalTile(vparas) → {object}
Get the shader of thermal tiles.
Parameters:

Name | Type | Description
---|---|---
vparas | object |

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) tileOrbs(vparas) → {object}
It's supposed to look like Thermal Particles III @shadertoy. UV distortion can be done as in Texture twistery @shadertoy, with the help of color re-shaping:
```glsl
vec2 getDistortion(vec2 uv, float d, float t) {
    uv.x += cos(d) + t * 0.9;
    uv.y += sin(d + t * 0.75);
    return uv;
}

vec4 getDistortedTexture(sampler2D iChannel, vec2 uv) {
    vec4 rgb = texture(iChannel, uv);
    return rgb;
}

void mainImage( out vec4 fragColor, in vec2 fragCoord ) {
    vec2 uv = fragCoord.xy / iResolution.xy;
    float t = iTime * 0.125;
    vec2 mid = vec2(0.5, 0.5);
    vec2 focus = iMouse.xy / iResolution.xy;
    float d1 = distance(focus + sin(t * 0.125) * 0.5, uv);
    float d2 = distance(focus + cos(t), uv);
    vec4 rgb = (getDistortedTexture(iChannel0, getDistortion(uv, d1, t))
              + getDistortedTexture(iChannel1, getDistortion(uv, -d2, t))) * 0.5;
    rgb.r /= d2;
    rgb.g += -0.5 + d1;
    rgb.b = -0.5 + (d1 + d2) / 2.0;
    // rgb.a = pow(rgb.a, 0.125);
    // rgb.g = pow(rgb.g, 0.5);
    rgb.r = pow(rgb.r, 1.5) * (sin(t * 10.) + 1.) * 0.125;
    rgb.b = pow(rgb.b, 1.5) * (sin((t + 5.2) * 10.) + 1.) * 0.125;
    fragColor = rgb;
}
```
Parameters:

Name | Type | Description
---|---|---
vparas | object | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) updatePhongUni(uniforms, light, paras) → {object}
A common function for updating phong uniforms. Supposed to be changed in the future - phong uniforms should not be common across different shaders?
Don't use this directly. It is only a shortcut for certain shaders.
Parameters:

Name | Type | Attributes | Description
---|---|---|---
uniforms | object | | uniforms to be updated
light | object | <optional> |
paras | paras | <optional> | usually Visual.paras

Returns:
uniforms
- Type: object
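A hedged sketch of refreshing an existing uniform set after the light changes (the shape of the light object is not documented here):

```js
// Re-apply light and visual paras onto the material's uniforms,
// e.g. after the scene light has been moved or re-colored.
xglsl.updatePhongUni(mat.uniforms, light, visual.paras);
```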
(static) worldOrbs(vparas) → {object}
Get the shader of a sequence of orbs, with respect to the vertex dir & cent attributes.
Test page: test/html/map3d/geopath-road.html
ShaderFlag: worldOrbs

```glsl
uniform float r;
uniform float whiteAlpha;
uniform sampler2D u_tex;
uniform vec3 wpos;
uniform vec3 offsets[${orbs}];
uniform vec3 orbScale;
uniform float r[${orbs}];
uniform vec4 orbColors[${orbs}];

attribute vec3 a_tan;
attribute vec3 a_pos;
```
Sub-function:
vec2 sdEllipsoid(): vector distance to the original point.
- param vec3 eye - raycast origin
- param vec3 u - ray direction
- param float r - orb radius
- param vec3 cnetr - orb center
- param vec3 abc - orb x y z scale
- return vec2 - distances where the ray intersects the ellipsoid; less than zero if not.

See https://math.stackexchange.com/questions/778171/intersection-of-ellipsoid-with-ray
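For reference, the standard ray-ellipsoid derivation behind such a function (a sketch only; how r and abc are combined inside the shader is not documented here). Scaling the ray into the orb's unit-sphere space componentwise,

$$
\mathbf{o} = (\mathrm{eye} - \mathrm{cnetr}) / (r\,\mathrm{abc}), \qquad
\mathbf{d} = \mathbf{u} / (r\,\mathrm{abc}),
$$

and solving $|\mathbf{o} + t\,\mathbf{d}|^2 = 1$ gives the two ray parameters

$$
t_{1,2} = \frac{-\,\mathbf{o}\cdot\mathbf{d} \;\pm\; \sqrt{(\mathbf{o}\cdot\mathbf{d})^2 - (\mathbf{d}\cdot\mathbf{d})\,(\mathbf{o}\cdot\mathbf{o} - 1)}}{\mathbf{d}\cdot\mathbf{d}},
$$

which equal Euclidean distances when u is normalized; a negative discriminant means the ray misses the orb (the "less than zero" case above).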
Parameters:

Name | Type | Description
---|---|---
vparas | object | paras.vert_scale [optional]: number, scale of vertices

Returns:
{vertexShader, fragmentShader}
- Type: object
(static) xyzLayer2(vparas) → {object}
Rendering tessellated planes in a box model.
Issue: the problem is that it can't find the distance to a polygon to restrict the floor area. There is an example by Inigo Quilez [2] that can figure out the distance quickly in a fragment shader, but we can't find an efficient way to send the polygon info into the fragment shader with WebGL [3].
Reference:
[1] Da Rasterizer, Example of fragment matrix operation.
Simplified Example:
```glsl
#define WEIGHT (12.0 / iResolution.x)

// rasterize functions
float line(vec2 p, vec2 p0, vec2 p1, float w) {
    vec2 d = p1 - p0;
    float t = clamp(dot(d, p - p0) / dot(d, d), 0.0, 1.0);
    vec2 proj = p0 + d * t;
    float dist = length(p - proj);
    dist = 1.0 / dist * WEIGHT * w;
    return min(dist * dist, 1.0);
}

// matrices
mat4 getRotMatrix(vec3 a) {
    vec3 s = sin(a);
    vec3 c = cos(a);
    mat4 ret;
    ret[0] = vec4(c.y*c.z, c.y*s.z, -s.y, 0.0);
    ret[1] = vec4(s.x*s.y*c.z - c.x*s.z, s.x*s.y*s.z + c.x*c.z, s.x*c.y, 0.0);
    ret[2] = vec4(c.x*s.y*c.z + s.x*s.z, c.x*s.y*s.z - s.x*c.z, c.x*c.y, 0.0);
    ret[3] = vec4(0.0, 0.0, 0.0, 1.0);
    return ret;
}

mat4 getPosMatrix(vec3 p) {
    mat4 ret;
    ret[0] = vec4(1.0, 0.0, 0.0, p.x);
    ret[1] = vec4(0.0, 1.0, 0.0, p.y);
    ret[2] = vec4(0.0, 0.0, 1.0, p.z);
    ret[3] = vec4(0.0, 0.0, 0.0, 1.0);
    return ret;
}

void mainImage( out vec4 fragColor, in vec2 fragCoord ) {
    vec2 uv = fragCoord.xy / iResolution.xy;
    uv = uv * 2.0 - 1.0;
    uv.x *= iResolution.x / iResolution.y;

    float line_width = 0.4;
    float time = iTime * 0.31415;
    vec3 c = vec3(mix(vec3(0.19, 0.13, 0.1), vec3(1.0), 0.5 * pow(length(uv) * 0.5, 2.0)));

    mat4 cam = getPosMatrix(vec3(0.0, 0.0, 10.0));
    mat4 rot = getRotMatrix(vec3(time, time * 0.86, time * 0.473));

    vec3 instances[2];
    instances[0] = vec3(0.0, 0.0, -1.0);

    // box pipeline
    for (int dip = 0; dip < 2; dip++) {
        // input assembly
        vec3 vert[8];
        vert[0] = vec3(-1.0, -1.0,  1.0);
        vert[1] = vec3(-1.0,  1.0,  1.0);
        vert[2] = vec3( 1.0,  1.0,  1.0);
        vert[3] = vec3( 1.0, -1.0,  1.0);
        vert[4] = vec3(-1.0, -1.0, -1.0);
        vert[5] = vec3(-1.0,  1.0, -1.0);
        vert[6] = vec3( 1.0,  1.0, -1.0);
        vert[7] = vec3( 1.0, -1.0, -1.0);

        // vertex processing
        mat4 pos = getPosMatrix(instances[dip] * 4.0);
        mat4 mat = pos * rot * cam;
        for (int i = 0; i < 8; i++) {
            // transform
            vert[i] = (vec4(vert[i], 1.0) * mat).xyz;
            // perspective
            vert[i].z = 1.0 / vert[i].z;
            vert[i].xy *= vert[i].z;
        }

        // primitive assembly and rasterize
        float i;
        i  = line(uv, vert[0].xy, vert[1].xy, line_width);
        i += line(uv, vert[1].xy, vert[2].xy, line_width);
        i += line(uv, vert[2].xy, vert[3].xy, line_width);
        i += line(uv, vert[3].xy, vert[0].xy, line_width);
        i += line(uv, vert[4].xy, vert[5].xy, line_width);
        i += line(uv, vert[5].xy, vert[6].xy, line_width);
        i += line(uv, vert[6].xy, vert[7].xy, line_width);
        i += line(uv, vert[7].xy, vert[4].xy, line_width);
        i += line(uv, vert[0].xy, vert[4].xy, line_width);
        i += line(uv, vert[1].xy, vert[5].xy, line_width);
        i += line(uv, vert[2].xy, vert[6].xy, line_width);
        i += line(uv, vert[3].xy, vert[7].xy, line_width);
        c += clamp(i, 0., 1.);
    }
    fragColor = vec4(c, 1.0);
}
```
See also another interesting example.
[2] the Winding Number Algorithm.
To find the distance to a polygon in a plane, one can use [2]. The shadertoy example can be found here. A simplified version, where wn = 2 is rendered as outside:

```glsl
// The MIT License
// Copyright © 2019 Inigo Quilez
// Distance to a regular pentagon, without trigonometric functions.
float dot2( in vec2 v ) { return dot(v,v); }
float cross2d( in vec2 v0, in vec2 v1 ) { return v0.x*v1.y - v0.y*v1.x; }

const int N = 9;

float sdPoly( in vec2[N] v, in vec2 p ) {
    const int num = v.length();
    float d = dot(p - v[0], p - v[0]);
    float s = 1.0;
    for( int i = 0, j = num - 1; i < num; j = i, i++ ) {
        // distance
        vec2 e = v[j] - v[i];
        vec2 w = p - v[i];
        vec2 b = w - e * clamp( dot(w, e) / dot(e, e), 0.0, 1.0 );
        d = min( d, dot(b, b) );

        // winding number from http://geomalgorithms.com/a03-_inclusion.html
        bvec3 cond = bvec3( p.y >= v[i].y, p.y < v[j].y, e.x*w.y > e.y*w.x ); // e.x / e.y > w.x / w.y
        if( all(cond) || all(not(cond)) ) s *= -1.0;
    }
    return s * sqrt(d);
}

void mainImage( out vec4 fragColor, in vec2 fragCoord ) {
    vec2 p = (2.0*fragCoord - iResolution.xy) / iResolution.y;

    vec2 v0 = 0.8*cos( 0.40*iTime + vec2(0.0, 2.00) + 0.0 );
    vec2 v1 = 0.8*cos( 0.45*iTime + vec2(0.0, 1.50) + 1.0 );
    vec2 v2 = 0.8*cos( 0.50*iTime + vec2(0.0, 3.00) + 2.0 );
    vec2 v3 = 0.8*cos( 0.55*iTime + vec2(0.0, 2.00) + 4.0 );
    vec2 v4 = 0.8*cos( 0.60*iTime + vec2(0.0, 1.00) + 5.0 );
    vec2 v5 = 0.8*cos( 0.45*iTime + vec2(0.0, 1.50) + 6.0 );
    vec2 v6 = 0.8*cos( 0.50*iTime + vec2(0.0, 3.00) + 7.0 );
    vec2 v7 = 0.8*cos( 0.55*iTime + vec2(0.0, 2.00) + 8.0 );
    vec2 v8 = 0.8*cos( 0.60*iTime + vec2(0.0, 1.00) + 9.0 );
    // add more points
    vec2[] poly = vec2[](v0, v1, v2, v3, v4, v5, v6, v7, v8);

    float d = sdPoly(poly, p);

    vec3 col = vec3(0.);
    col = mix( col, vec3(1.0), 1.0 - smoothstep(0.0, 0.015, abs(d)) );

    fragColor = vec4(col, 1.0);
    if (d < 0.) fragColor.b = .4;
}
```
[3] discussion on stackoverflow: Get vertex positions in fragment shader
Parameters:

Name | Type | Description
---|---|---
vparas | object | Visual.paras

Returns:
{fragmentShader, vertexShader}
- Type: object