Testing recommendations – how do you test your Verge3D app?

    #53501
    Pascal
    Customer

    Hey fellows,

    Let's come together in this thread and discuss strategies for testing a Verge3D app.

    I have a background in web development, and I remember the good old days when a simple HTML page with some CSS needed testing on various browsers, in plenty of versions, and on multiple platforms, only to find out that every hack that solved an unwanted effect in one test case led to another unwanted effect in a different browser, version or platform.
    Luckily, the time of browser wars and non-standard browser behaviour is over. We have had a nice period of web-standards adoption across most platforms and browsers, and today a simple HTML and CSS webpage renders more or less identically everywhere. (Yes, I know, not really… but you get the idea…)

    Today, we WebGL enthusiasts are facing a comeback of the time-wasting and, in my case, uncoordinated process of testing, testing and even more testing.

    The main categories that need to be tested are:

    • Browsers, in different versions
    • OSes, flavors and settings
    • GFX hardware and drivers

    I feel like these testing scenarios are even more complex than they were in web development in the late 90s. :cry:

    I would like to collect the individual voices, ideas and experiences of Verge3D users/developers here in this thread.

    What needs to be tested and how?

    Maybe that leads to some kind of matrix where we can all look up which combinations of browsers, OS and hardware work, and where the individual limits of each setup are…?

    #53503
    Pascal
    Customer

    A first step might be to write down an overview (table) of the platforms, OSes, browsers and versions that could/should be part of a comprehensive test matrix.
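
    Just to make the idea concrete: such a matrix could start as a small config file that a test runner (or a human) walks through. The entries below are only placeholders, not a recommendation:

    ```typescript
    // Sketch of a test matrix as data; the OS/browser/device entries are placeholders.
    type TestTarget = {
      os: string;       // e.g. "Windows 11", "macOS 14", "iOS 17"
      browser: string;  // e.g. "Chrome", "Safari", "Firefox"
      version: string;  // "latest", "latest-1", or a pinned version
      device?: string;  // physical device, if relevant (phones, tablets)
      notes?: string;   // known limits of this combination
    };

    const testMatrix: TestTarget[] = [
      { os: "Windows 11", browser: "Chrome", version: "latest" },
      { os: "macOS 14",   browser: "Safari", version: "latest" },
      { os: "iOS 17",     browser: "Safari", version: "latest", device: "iPhone 13" },
      { os: "Android 14", browser: "Chrome", version: "latest", device: "Pixel 6 Pro" },
    ];

    // A test run is then simply: for each target, run the test script and record pass/fail.
    ```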

    I don’t trust any browser statistics, but there is no better option I can think of right now:
    https://de.statista.com/statistik/daten/studie/436243/umfrage/meistgenutzte-browser-im-internet-in-deutschland/

    Does anybody know a valid data source on OS and tablet distribution?

    #53507
    xeon
    Customer

    This is a topic of much concern when building apps for commercial use.
    The following is how we go about it from a general perspective.

    Overview
    1. Business Requirements Document – specifies the required devices, operating systems, and browsers. There are so many variations in the world that testing them all is just not a cost-effective approach. However, clients tend to have a good idea of the usage stats for each of these based on their existing websites. This needs to be defined and locked in at the start of development.

    2. Functional specifications – all the functions need to be mapped out. Basically, what button does what. All the logic needs to be there so you know what to test and what the expected outcome is. This too needs to be locked in at the start of development.

    3. Test scripts – every button click and every flow of logic needs to be in a document that someone can run through to validate whether the application is running correctly.

    4. Depending on the critical nature of the application, a Failure Mode and Effects Analysis (FMEA) and Control Plan should be created to keep track of how bugs are mitigated and what controls are put in place to prevent them. Most importantly, this information needs to be fed back into the test script so that the script covers vulnerable/critical areas effectively.

    5. The test method, browsers and OS need to be defined up front in your SOW/agreement with the client. As we all know, long developments can span several releases of an OS, browser, etc. These changes can have a severe impact on development, so your SOW/agreement should outline how those particular changes are handled and/or excluded.

    Virtual Testing – depending on the security/liability requirements of a given app, virtual testing can be done through a number of online providers. You can write scripts for automated testing or grind through the test scripts manually.
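
    As a rough sketch of what such an automated script can look like (this assumes Playwright; the URL and the commented-out selectors are placeholders, not from a real project):

    ```typescript
    // Minimal Playwright smoke-test sketch; URL and selectors are placeholders.
    import { test, expect } from '@playwright/test';

    test('app loads and the WebGL canvas appears', async ({ page }) => {
      await page.goto('https://example.com/my_verge3d_app/'); // placeholder URL

      // A Verge3D app renders into a <canvas>; wait until it shows up.
      await expect(page.locator('canvas')).toBeVisible({ timeout: 30_000 });

      // A functional step from the test script would follow the same pattern
      // (element ids are hypothetical):
      // await page.click('#open-configurator');
      // await expect(page.locator('#configurator-panel')).toBeVisible();
    });
    ```

    The same script can then be run locally or handed to one of the online providers for execution on their device pool.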

    Physical Testing – typically the devices specified in the business requirements documentation and the SOW/agreement will either be part of your existing testing equipment or will need to be provided by the client or purchased. There is no easy method: wipe the device, reset it to factory settings and test per the test script.

    MANY VARIATIONS:
    There are so many OS, browser and device combinations that the number of tests can get quite large. In some cases, you can get your client to agree to one very large test at launch, with subsequent tests being smoke tests or limited device/OS/browser configurations that hit an agreed-upon percentage of the intended customers.

    AR/VR – This adds additional complexity due to the wide variety of headsets and to iOS causing a fork in the code. We have not found an easy way around physical testing on the specified and agreed-upon devices.

    CLIENT TESTING REQUIREMENTS/ YOUR LIABILITY / INSURANCE
    Depending on the client, they may require your software to go through their own testing and approval process prior to being integrated. Others are more relaxed and don’t really think about testing until they find out something is not working. The latter case makes it important for the developer to get sign-off at the time of completion confirming that the application has passed the client’s inspection and is free from defects. Without this, the client will continually come back with issues that have to be tracked down as new versions of OS, browsers, devices, etc. come out. Most larger projects require the developer to carry insurance as well as liability for losses related to their code. A good test program and client approval can mitigate liability if an issue ever goes to court, but nothing beats good liability insurance.

    Internal testing / Demos, etc.
    When we do an internal project, we take a different approach. We are the customer, and we are not going to take ourselves to court.

    Testing platforms:
    Chrome – desktop (latest Windows / latest OSX)
    Safari – latest OSX
    Android – the latest Samsung Galaxy we have on hand, Pixel 6 Pro
    iPhone – iPhone XS Max with the latest iOS
    iPad – iPad Pro (the older the better) with the latest iPadOS

    Step 1: Just test on desktop Chrome until that works well. This testing is done during development, with an occasional test on iOS to validate that materials and lights are matching up nicely. Shape keys/morph targets always get specific testing on both desktop Chrome and iOS.

    Step 2: Final export of glTF. We open the glTF to do a material and lighting verification in the various combinations of OS, browser and device. Oftentimes we find we have to go back and create device-specific materials for things to look as intended.
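
    If you just want a neutral reference to compare against, a bare-bones glTF viewer is only a few lines (this is a generic three.js sketch, not our actual tooling; the file path is a placeholder):

    ```typescript
    // Bare-bones glTF viewer sketch (three.js); the file path is a placeholder.
    import * as THREE from 'three';
    import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(
      45, window.innerWidth / window.innerHeight, 0.1, 100);
    camera.position.set(0, 1, 3);

    // Simple lighting rig so materials are visible for the eyeball check.
    scene.add(new THREE.AmbientLight(0xffffff, 0.5));
    const dirLight = new THREE.DirectionalLight(0xffffff, 1.0);
    dirLight.position.set(3, 5, 2);
    scene.add(dirLight);

    new GLTFLoader().load('exported_scene.gltf', (gltf) => {
      scene.add(gltf.scene);
      renderer.setAnimationLoop(() => renderer.render(scene, camera));
    });
    ```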

    Step 3: Puzzles. Program the puzzles as intended, with spot-checking on desktop Chrome only.

    Step 4: HTML/JavaScript – test as needed in Chrome, using dev tools to emulate devices to ensure responsiveness.
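
    The same device emulation can also be scripted if you want it repeatable (again a Playwright sketch; the device names come from Playwright's built-in descriptor list and the URL is a placeholder):

    ```typescript
    // Scripted device-emulation sketch (Playwright); the URL is a placeholder.
    import { test, expect, devices } from '@playwright/test';

    for (const name of ['iPhone 13', 'iPad Pro 11', 'Pixel 5'] as const) {
      test(`layout holds together on ${name}`, async ({ browser }) => {
        const context = await browser.newContext({ ...devices[name] });
        const page = await context.newPage();
        await page.goto('https://example.com/my_verge3d_app/');
        await expect(page.locator('canvas')).toBeVisible();
        await page.screenshot({ path: `responsive-${name}.png` }); // eyeball the layout later
        await context.close();
      });
    }
    ```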

    Step 5: Data integration – during development, test as needed; this part is largely browser/device-independent, you just need to validate that it works on the target devices.

    Step 6: Analytics integration – validate using Chrome; it will be tested in full on the other devices later.

    Step 7: FULL TEST VALIDATION –
    This starts with Chrome on the desktop. Every aspect of the build is tested and validated. If there are any issues… back to dev and restart the process. If it passes, move to Safari on OSX. This is a visual as well as a functional test. If it fails… back to dev and restart testing. If it passes… move on to iOS devices and repeat the process through all the devices and browsers.

    ONGOING TESTING:
    As many of us know, iOS 15.4 caused many apps to stop functioning. We always recommend a maintenance plan with all Verge3D apps, so you can test older projects against newer OS, browser and device versions and quote fixes.

    GFX Hardware – it is up to the developer to decide whether to get this granular in their delivery requirements. For applications that are tied to a specific piece of hardware, such as at an event or trade show, you will definitely want to specify the hardware prior to development and even have that specific hardware on hand during development to test on. For general web applications, on the other hand, we do not recommend testing individual graphics hardware, as the number of permutations becomes impossible to cover.
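
    If you do decide to track graphics hardware, one practical trick is to log the WebGL renderer string from the test device itself, so each test run records which GPU/driver it actually hit. This uses the standard WEBGL_debug_renderer_info extension (on some browsers the plain RENDERER parameter already returns a useful string):

    ```typescript
    // Sketch: read the (unmasked) WebGL renderer string so test logs record the GPU.
    function getGpuInfo(): string {
      const canvas = document.createElement('canvas');
      const gl = canvas.getContext('webgl');
      if (!gl) return 'WebGL not available';

      const ext = gl.getExtension('WEBGL_debug_renderer_info');
      const renderer = ext
        ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
        : gl.getParameter(gl.RENDERER); // fallback: may be a generic string
      return String(renderer);
    }

    console.log('GPU in this test run:', getGpuInfo());
    ```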

    My two cents…hope to hear how others are doing it.

    Xeon
    Route 66 Digital
    Interactive Solutions - https://www.r66d.com
    Tutorials - https://www.xeons3dlab.com

    #53534
    Pascal
    Customer

    My two cents…

    :scratch:

    Well, that is…hmm.. I would say that is more than a good start. I definitely need some time to assimilate the information.
