
Lighting Issue on Android – Augmented reality


  • Author
    Posts
  • #81846
    gestion
    Customer

    Hello,

    When I export my object from Blender to Verge3D .gltf, the object appears very dark in augmented reality on Android (it is fine on iOS).
    On the other hand, when I manually export it as .glb and view it in augmented reality with a different method, the lighting is perfect.

    So I assume that Verge3D somehow compresses the lighting information.
    How can I avoid this compression? (if that is the case)

    I had a similar issue with texture resolution that was solved here.

    PS: I use Google Scene Viewer to view the object in augmented reality, since WebXR has compatibility issues with Samsung devices.
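    For reference, Scene Viewer can be launched from a plain link. Here is a minimal sketch of how such a link might be built; the endpoint and parameters follow Google's published Scene Viewer URL scheme, and the model URL is a placeholder:

    ```python
    from urllib.parse import quote

    def scene_viewer_url(glb_url: str, ar_only: bool = True) -> str:
        """Build a Google Scene Viewer link for a hosted .glb file.

        The 1.0 endpoint takes the model URL in the `file` parameter;
        `mode=ar_only` skips the 3D preview and goes straight to AR.
        """
        base = "https://arvr.google.com/scene-viewer/1.0"
        url = f"{base}?file={quote(glb_url, safe='')}"
        if ar_only:
            url += "&mode=ar_only"
        return url

    # Hypothetical hosting URL, for illustration only
    print(scene_viewer_url("https://example.com/models/product.glb"))
    ```

    Opening such a link on an ARCore-capable Android device hands the model to Scene Viewer, which applies its own lighting environment.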

    #81864

    Hi,
    Perhaps the environment lighting is different. HDR maps are supported only in Verge3D, while other engines might provide their own built-in lighting environment.

    Soft8Soft Tech Chief
    X | FB | LinkedIn

    #82052
    gestion
    Customer

    I don’t have any HDRI or lights. So whether I export to Verge3D .gltf or glTF 2.0, both have the same lighting parameters (none).

    When I open them using the Google viewer (as I said, WebXR is not compatible with most Samsung phones), the object that was exported with glTF 2.0 works fine.
    I have an issue with the object that was exported with Verge3D though.

    In the reflection of the sphere, we can see that they both share the same built-in environment.
    I have also reduced the size of the object as much as possible to be sure that is not the issue.

    I have also applied the scale.

    I have tried checking and unchecking the glTF 2.0 compatibility option in the Verge3D settings. No effect.

    #82056
    xeon
    Customer

    As you have determined, there are differences between iOS and Android with respect to their AR viewers, but the differences go beyond the viewer itself; the viewers are not the same in any way.

    iOS uses the lighting of the room the device is in to simulate light on your 3D object. It also uses a unique color space, which forces those of us who require accurate colors to adjust textures to an optimized lighting space for testing, knowing the colors will change based on room lighting.

    Android uses the light sources in the V3D scene but has yet another color space, requiring another set of textures and materials if you need to maintain color accuracy. Note that the color spaces for iOS and Android are drastically different, so we typically have different materials for each. Android does not use room lighting or the environment to light your object, so you will need to set up your own lighting, whereas iOS will ignore your lighting.

    Given that lights have a drastic effect on performance, it is often better to have separate apps for the two platforms to maximize efficiency, since the lighting, color space, and materials differ, and you may have to perform model optimization for one platform or the other to account for lights.

    • This reply was modified 1 month ago by xeon.

    Xeon
    Route 66 Digital
    Interactive Solutions - https://www.r66d.com
    Tutorials - https://www.xeons3dlab.com

    #82097
    gestion
    Customer

    Hi Xeon,

    I agree with the differences between iOS and Android that you mentioned.
    The built-in lighting for AR on iOS is great, and I’ve never had any issues with it.
    The USDZ works fine. My issue is limited to Android use.

    In my case, I use Verge3D only to configure the product and export the GLB file. I don’t use any lighting at all in my scene (not even HDRI).

    After finishing the configuration, I run a custom script to extract the base64-encoded GLB in order to send it to the Google Viewer for AR. In this scenario, I don’t rely on WebXR at all—I rely solely on the Google Viewer’s lighting environment.
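    For anyone attempting something similar, a minimal sketch of what that extraction step could look like follows; the function name and error handling are my own, as the original script is not shown. A valid GLB starts with the 4-byte magic "glTF", a uint32 version (2), and a uint32 total length:

    ```python
    import base64
    import struct

    def decode_glb(b64_payload: str, out_path: str) -> int:
        """Decode a base64-encoded GLB string, sanity-check the container
        header, and write the binary to disk. Returns the declared length."""
        raw = base64.b64decode(b64_payload)
        magic, version, length = struct.unpack_from("<4sII", raw, 0)
        if magic != b"glTF" or version != 2:
            raise ValueError("payload is not a glTF 2.0 binary (GLB)")
        if length != len(raw):
            raise ValueError("declared GLB length does not match payload size")
        with open(out_path, "wb") as f:
            f.write(raw)
        return length
    ```

    Checking the magic bytes and declared length catches truncated or double-encoded payloads before they ever reach the viewer.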

    What I’ve noticed is that for simple, small objects, the lighting looks great in the Google Viewer when using Verge3D’s glTF export. However, when the object becomes more complex (see my example), the lighting quality noticeably drops.
    On the other hand, if I export the same object directly from Blender as a GLB, the lighting is consistently good.

    So, my original question was: is there anything in Verge3D’s glTF export process that could impact the quality of lighting when exporting a Blender object as GLB (compared to exporting directly as glTF 2.0 from Blender)?
    (By the way, I’ve tried the glTF 2.0 compatibility setting; it made no difference.)
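    One way to narrow this down is to diff the two files directly. A GLB is a 12-byte header followed by a JSON chunk (this is the standard GLB container layout from the glTF 2.0 spec, not anything Verge3D-specific), so the material parameters and extensions each exporter wrote can be compared side by side. A quick diagnostic sketch: if one export declares something like KHR_materials_unlit, or very different metallic/roughness factors, that would explain the lighting gap.

    ```python
    import json
    import struct

    def glb_json(data: bytes) -> dict:
        """Extract the JSON chunk from a GLB byte string.

        After the 12-byte header, the first chunk must be the JSON chunk
        (type 'JSON'); its length is the uint32 immediately before it."""
        magic, version, _ = struct.unpack_from("<4sII", data, 0)
        assert magic == b"glTF" and version == 2
        chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
        assert chunk_type == b"JSON"
        return json.loads(data[20:20 + chunk_len])

    def summarize(data: bytes) -> dict:
        """Report the material/extension details most likely to affect lighting."""
        gltf = glb_json(data)
        return {
            "extensions": gltf.get("extensionsUsed", []),
            "materials": [
                {"name": m.get("name"),
                 "metallic": m.get("pbrMetallicRoughness", {}).get("metallicFactor"),
                 "roughness": m.get("pbrMetallicRoughness", {}).get("roughnessFactor")}
                for m in gltf.get("materials", [])
            ],
        }
    ```

    Running `summarize(open("verge3d.glb", "rb").read())` against both exports and comparing the output would show whether the two files actually describe the same materials.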

    #82121

    Hi,
    Try to export with “glTF compatible” enabled from Blender first, then export this scene using the puzzle.

    Basically, the puzzle seeks to reduce the complexity of the node-based materials before export, which is why you might see different results. If you load your scene with “glTF compatible” materials first, this process should be less destructive to your shaders.

    Also, you might load your assets back into Blender to see if anything changes, and make the proper corrections in the source materials as needed.

    Soft8Soft Tech Chief
    X | FB | LinkedIn
