Submit tests
3
tests/integration/experimental/dash3d/.gitignore
vendored
Normal file
@@ -0,0 +1,3 @@
cypress/test_output
cypress/videos
cypress/screenshots
101
tests/integration/experimental/dash3d/README.md
Normal file
@@ -0,0 +1,101 @@
# Dash3D Integration Testing (beta)

Because Dash3d is still in an experimental stage, we opt for only
high-level integration testing at this point.

## Dependencies

To run tests, install [node js](https://nodejs.org/):
```
conda install -c conda-forge nodejs
```

Make sure to install the system dependencies required by
[Cypress](https://docs.cypress.io/guides/getting-started/installing-cypress.html#System-requirements).

Most front-end dependencies are managed by [npm](https://www.npmjs.com/),
which is installed automatically with node. To install front-end dependencies, run
the following from the **root of kaolin**:
```
npm install
```

## How to run all tests

All integration tests are wrapped in python. To run all tests
from the root of kaolin:
```
pytest --capture=tee-sys tests/integration/experimental/dash3d/
```

## Under the Hood

Currently, the only javascript tests are integration tests, and we call
the javascript tests from a python test file. In the future, this
may change.

#### Mocha Tests

Javascript tests that can run *outside of the browser* are
implemented using [Mocha](https://mochajs.org/). Note that
currently, **all js tests are called from python**. To run
Mocha tests manually, run:
```
npx mocha "./tests/integration/experimental/dash3d/*.js"
```
*Note:* this will fail, because it expects
data generated by the python wrapper tests.

Failing Mocha tests can be debugged in Chrome by running:
```
./node_modules/mocha/bin/mocha --inspect --inspect-brk path/to/test.js
```
Then, in Chrome navigate to [chrome://inspect/](chrome://inspect/) and
click "Open dedicated DevTools for Node". You may need to manually add
the test and `static` subdirectories under sources.

#### Cypress Tests

End-to-end tests that *require a browser* are implemented
using [Cypress](https://www.cypress.io/). Note that currently, **all cypress
tests are called from python**, but the tests themselves are written in
javascript and located in `tests/integration/experimental/dash3d/cypress/integration/`.
It is essential to be able to run them manually for debugging.

First, run a python test that spins up a dash3d instance in the
background (note that repeated invocations may require you
to append `--skip_start_dash3d` to the command in case
dash3d is already running):
```
python -m tests.integration.experimental.dash3d.run_e2e_test
```
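For example, if dash3d is still running from a previous invocation, the same test can be re-run with the flag appended:
```
python -m tests.integration.experimental.dash3d.run_e2e_test --skip_start_dash3d
```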
This test also runs the cypress tests, but in case they fail it's useful to invoke cypress manually.

To open the cypress UI:
```
npx cypress open --config-file tests/integration/experimental/dash3d/cypress.json
```

Alternatively, run the cypress tests non-interactively (this is what `run_e2e_test` calls):
```
npx cypress run --config-file tests/integration/experimental/dash3d/cypress.json
```

#### Debugging Cypress Tests

Cypress writes a lot of diagnostic information during testing. Opening the debug
console in the browser during test execution is helpful. Also, check
out the following directories:
* screenshots: `tests/integration/experimental/dash3d/cypress/screenshots/`
* videos: `tests/integration/experimental/dash3d/cypress/videos/`
* renderings from tests: `tests/integration/experimental/dash3d/cypress/test_output/`

Most of the tests perform visual regression to ensure that the right
geometry is passed from the server to the client. As a consequence,
changes to rendering properties will break the tests and require
changes to the golden files. The `test_output` directory will contain
the updated golden files.
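
For example, after visually verifying a new rendering under `test_output`, it can be
promoted to a golden file by copying it over the matching fixture (shown here for the
iteration-50 mesh golden written under the `test_set_its` subfolder; adjust the subfolder
and image name to your case):
```
cp tests/integration/experimental/dash3d/cypress/test_output/test_set_its/actual/mesh_output_id0_it50.png \
   tests/integration/experimental/dash3d/cypress/fixtures/images/mesh_output_id0_it50.png
```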
11
tests/integration/experimental/dash3d/cypress.json
Normal file
@@ -0,0 +1,11 @@
{
  "fileServerFolder": "tests/integration/experimental/dash3d/cypress",
  "componentFolder": "tests/integration/experimental/dash3d/cypress/component",
  "downloadsFolder": "tests/integration/experimental/dash3d/cypress/downloads",
  "fixturesFolder": "tests/integration/experimental/dash3d/cypress/fixtures",
  "integrationFolder": "tests/integration/experimental/dash3d/cypress/integration",
  "pluginsFile": "tests/integration/experimental/dash3d/cypress/plugins/index.js",
  "screenshotsFolder": "tests/integration/experimental/dash3d/cypress/screenshots",
  "supportFile": "tests/integration/experimental/dash3d/cypress/support/index.js",
  "videosFolder": "tests/integration/experimental/dash3d/cypress/videos"
}
Binary files: 12 image files added (12–30 KiB each); image previews omitted.
@@ -0,0 +1,129 @@
// Copyright (c) 2019-2020, NVIDIA CORPORATION. All rights reserved.

// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at

//     http://www.apache.org/licenses/LICENSE-2.0

// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

const assert = require('assert');


const TYPES_TO_TEST = ['mesh', 'pointcloud'];
const NVIEWS = 2;

// This tests the renderings in the viewports against ground truth images
describe('Visual Regression', () => {
  beforeEach(function() {
    // To update these, you can use one of 2 ways:
    // 1. Look at test output saved in DEBUG_FOLDER
    // 2. Load dash3d at localhost:8008 and run commands like this in the console:
    //    nvidia.util.downloadURL('mesh0.png', $("#mesh-view0 canvas")[0].toDataURL())

    // Initial renderings
    cy.fixture('images/mesh_gt_id0.png').as('mesh0_data');  // fixture data names
    cy.fixture('images/mesh_output_id0_final.png').as('mesh1_data');
    cy.fixture('images/pointcloud_input_id0.png').as('pointcloud0_data');
    cy.fixture('images/pointcloud_output_id0_final.png').as('pointcloud1_data');

    // Specific renderings (caused by user input)
    cy.fixture('images/mesh_gt_id0.png').as('mesh_ground_truth_id0');
    cy.fixture('images/mesh_gt_id1.png').as('mesh_ground_truth_id1');
    cy.fixture('images/mesh_output_id0_final.png').as('mesh_output_id0');  // last it
    cy.fixture('images/mesh_output_id0_it50.png').as('mesh_output_id0_it50');
    cy.fixture('images/mesh_output_id1_final.png').as('mesh_output_id1');  // last it
    cy.fixture('images/mesh_output_id1_it50.png').as('mesh_output_id1_it50');
    cy.fixture('images/pointcloud_input_id0.png').as('pointcloud_input_id0');
    cy.fixture('images/pointcloud_input_id1.png').as('pointcloud_input_id1');
    cy.fixture('images/pointcloud_output_id0_final.png').as('pointcloud_output_id0');  // last it
    cy.fixture('images/pointcloud_output_id0_it50.png').as('pointcloud_output_id0_it50');
    cy.fixture('images/pointcloud_output_id1_final.png').as('pointcloud_output_id1');  // last it
    cy.fixture('images/pointcloud_output_id1_it50.png').as('pointcloud_output_id1_it50');
  })

  it('Initial Page Rendering', () => {
    cy.visit('http://localhost:8008/');

    // Note: this part depends on the initial rendering, which may change
    cy.wait(2000).then(() => {
      cy.wrap(TYPES_TO_TEST).each((tname) => {
        cy.wrap([0, 1]).each((v) => {
          // e.g. '#mesh-view0 canvas'
          var view_selector = '#' + tname + '-view' + v + ' canvas';
          var data_name = '@' + tname + v + '_data';  // fixture data name
          cy.checkCanvasRendering(view_selector, data_name, 'test_initial_render');
        });
      });
    });
  });

  it('Setting Category and ID', () => {
    cy.visit('http://localhost:8008/');

    // Select the right id and category and test that we can load
    // requested geometry in every viewport
    var cats_per_type = { 'mesh': ['ground_truth', 'output'],
                          'pointcloud': ['input', 'output'] };
    cy.wait(2000).then(() => {
      cy.wrap(TYPES_TO_TEST).each((tname) => {
        cy.wrap([0, 1]).each((view_id) => {
          cy.wrap(cats_per_type[tname]).each((cat_name) => {
            cy.wrap([0, 1]).each((mesh_id) => {
              // e.g. '#mesh-view0 canvas'
              var view_selector = '#' + tname + '-view' + view_id + ' canvas';
              var category_selector = '#' + tname + '-header' + view_id + ' select.cat';
              var id_selector = '#' + tname + '-header' + view_id + ' select.id';
              var data_name = '@' + tname + '_' + cat_name + '_id' + mesh_id;
              // Set category and id in the viewport
              cy.get(id_selector).select('id ' + mesh_id).then(() => {
                cy.get(category_selector).select(cat_name).wait(1000).then(() => {
                  console.log('Set category ' + cat_name + ' and id ' + mesh_id);
                  // Check rendering
                  cy.checkCanvasRendering(view_selector, data_name, 'test_set_category_and_id');
                });
              });
            });
          });
        });
      });
    });
  });

  it('Setting Global Iteration Number', () => {
    cy.visit('http://localhost:8008/');

    cy.get('#mesh-header0 select.cat').select('output').then(() => {
      cy.get('#mesh-header0 select.id').select('id 0').then(() => {
        cy.get('#mesh-header1 select.cat').select('ground_truth').then(() => {
          cy.get('#mesh-header1 select.id').select('id 0').then(() => {
            cy.get('#pointcloud-header0 select.cat').select('output').then(() => {
              cy.get('#pointcloud-header0 select.id').select('id 0').then(() => {
                cy.get('#pointcloud-header1 select.cat').select('input').then(() => {
                  cy.get('#pointcloud-header1 select.id').select('id 0').then(() => {
                    cy.get('#timeslider').invoke('val', 50).trigger('change').wait(1000).then(() => {
                      let test_subfolder = 'test_set_its';
                      cy.checkCanvasRendering(
                          '#mesh-view0 canvas', '@mesh_output_id0_it50', test_subfolder);
                      cy.checkCanvasRendering(
                          '#mesh-view1 canvas', '@mesh_ground_truth_id0', test_subfolder);
                      cy.checkCanvasRendering(
                          '#pointcloud-view0 canvas', '@pointcloud_output_id0_it50', test_subfolder);
                      cy.checkCanvasRendering(
                          '#pointcloud-view1 canvas', '@pointcloud_input_id0', test_subfolder);
                    });
                  });
                });
              });
            });
          });
        });
      });
    });
  });
})
@@ -0,0 +1,21 @@
/// <reference types="cypress" />
// ***********************************************************
// This example plugins/index.js can be used to load plugins
//
// You can change the location of this file or turn off loading
// the plugins file with the 'pluginsFile' configuration option.
//
// You can read more here:
// https://on.cypress.io/plugins-guide
// ***********************************************************

// This function is called when a project is opened or re-opened (e.g. due to
// the project's config changing)

/**
 * @type {Cypress.PluginConfig}
 */
module.exports = (on, config) => {
  // `on` is used to hook into various events Cypress emits
  // `config` is the resolved Cypress config
}
@@ -0,0 +1,56 @@
// Copyright (c) 2019-2020, NVIDIA CORPORATION. All rights reserved.

// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at

//     http://www.apache.org/licenses/LICENSE-2.0

// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

const IMG_WIDTH = 300;
const MAX_DIFFERING_PIXELS = Math.floor(IMG_WIDTH * IMG_WIDTH * 0.02);
const DEBUG_FOLDER = 'tests/integration/experimental/dash3d/cypress/test_output/';

Cypress.Commands.add('checkCanvasRendering', (view_selector, data_name, testsubfolder) => {
  cy.window().then((win) => {
    expect(cy.get(view_selector));
    cy.get(view_selector)
      .then(($el) => {
        return win.nvidia.test.convertDataUrl($el.get(0).toDataURL(), IMG_WIDTH);
      })
      .then((actual) => {
        cy.get(data_name)
          .then((img_data) => {
            return win.nvidia.test.convertDataUrl('data:image/png;base64,' + img_data, IMG_WIDTH);
          })
          .then((expected) => {
            console.log('Actual: ');
            console.log(actual);
            console.log('Expected: ');
            console.log(expected);
            let compare = win.nvidia.test.getImageDiff(expected[0], actual[0]);
            console.log(compare);
            let fprefix = DEBUG_FOLDER + testsubfolder;
            cy.writeFile(fprefix + '/expected/' + data_name.slice(1) + '_expected.png',
                         win.nvidia.test.stripBase64Marker(expected[1]), 'base64')
              .then(() => {
                cy.writeFile(fprefix + '/actual/' + data_name.slice(1) + '.png',
                             win.nvidia.test.stripBase64Marker(actual[1]), 'base64');
              })
              .then(() => {
                cy.writeFile(fprefix + '/expected/' + data_name.slice(1) + '_diff.png',
                             win.nvidia.test.stripBase64Marker(
                                 win.nvidia.test.imageDataToDataUrl(compare[0])), 'base64');
              })
              .then(() => {
                expect(compare[1]).to.be.lessThan(MAX_DIFFERING_PIXELS);
              });
          });
      });
  });
});
@@ -0,0 +1,20 @@
// ***********************************************************
// This example support/index.js is processed and
// loaded automatically before your test files.
//
// This is a great place to put global configuration and
// behavior that modifies Cypress.
//
// You can change the location of this file or turn off
// automatically serving support files with the
// 'supportFile' configuration option.
//
// You can read more here:
// https://on.cypress.io/configuration
// ***********************************************************

// Import commands.js using ES2015 syntax:
import './commands'

// Alternatively you can use CommonJS syntax:
// require('./commands')
132
tests/integration/experimental/dash3d/run_e2e_test.py
Normal file
@@ -0,0 +1,132 @@
#!/usr/bin/env python3

# Copyright (c) 2019-2020, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import argparse
import logging
import os
import shutil
import subprocess
import sys

THIS_DIR = os.path.dirname(os.path.realpath(__file__))
KAOLIN_ROOT = os.path.realpath(
    os.path.join(THIS_DIR, os.pardir, os.pardir, os.pardir, os.pardir))

logger = logging.getLogger(__name__)


def obj_paths():
    samples_dir = os.path.join(KAOLIN_ROOT, 'tests', 'samples')
    return [os.path.join(samples_dir, 'rocket.obj'),
            os.path.join(samples_dir, 'model.obj')]


def timelapse_path():
    return os.path.realpath(
        os.path.join(KAOLIN_ROOT, 'tests', 'samples', 'timelapse', 'notexture'))


def golden_screenshots_path():
    return os.path.join(THIS_DIR, 'cypress', 'fixtures')


def cypress_config_path():
    # Important: must be relative
    return os.path.join('tests', 'integration', 'experimental', 'dash3d', 'cypress.json')


def port():
    return 8008


def generate_timelapse_input():
    objs = ','.join(obj_paths())
    out_dir = timelapse_path()
    script = os.path.realpath(
        os.path.join(KAOLIN_ROOT, 'examples', 'tutorial', 'visualize_main.py'))

    args = f'--skip_normalization --test_objs={objs} --output_dir={out_dir}'
    command = f'python {script} {args}'
    logger.info(f'Re-generating timelapse input here: {out_dir}\n by running {command}')
    if os.path.exists(out_dir):
        shutil.rmtree(out_dir)
    os.makedirs(out_dir, exist_ok=True)
    ret = os.system(command)
    if ret != 0:
        raise RuntimeError('Creation of timelapse failed')


def start_dash3d():
    script = os.path.realpath(os.path.join(THIS_DIR, 'start_dash3d.sh'))
    logdir = timelapse_path()
    _port = port()

    command = f'{script} {logdir} {_port}'
    logger.info(f'Starting dash3d server in the background by running {command}')
    ret = os.system(command)

    if ret != 0:
        raise RuntimeError('Failed to start Dash3D')


def run_cypress():
    command = 'npx cypress run --config-file {}'.format(cypress_config_path())
    logger.info(f'Starting cypress by running {command}')
    os.chdir(KAOLIN_ROOT)
    ret = os.system(command)
    if ret != 0:
        raise RuntimeError('Failed cypress integration test')


def run_end_to_end_integration_tests():
    print('END 2 END INTEGRATION TEST FOR DASH 3D-------------------------------')
    print('Timelapse input: {}'.format(timelapse_path()))
    print('Server: http://localhost:{}'.format(port()))
    print('Golden screenshot files: {}'.format(golden_screenshots_path()))
    print('Visual comparison results: ')


def run_main(regenerate_timelapse_input,
             skip_start_dash3d):
    logging.basicConfig(level=logging.DEBUG,
                        format='%(asctime)s|%(levelname)8s|%(name)15s| %(message)s',
                        handlers=[logging.StreamHandler(sys.stdout)])

    if regenerate_timelapse_input:
        generate_timelapse_input()

    if not skip_start_dash3d:
        start_dash3d()

    run_cypress()


class TestBinaryEncoding:
    def test_server_client_binary_compatibility(self):
        run_main(regenerate_timelapse_input=False,
                 skip_start_dash3d=False)


if __name__ == "__main__":
    aparser = argparse.ArgumentParser()
    aparser.add_argument('--regenerate_timelapse_input', action='store_true',
                         help='If set, will regenerate timelapse input in {}.'.format(timelapse_path()))
    aparser.add_argument('--skip_start_dash3d', action='store_true',
                         help='If set, will skip starting dash3d, which may already be running.')
    args = aparser.parse_args()

    run_main(regenerate_timelapse_input=args.regenerate_timelapse_input,
             skip_start_dash3d=args.skip_start_dash3d)
46
tests/integration/experimental/dash3d/start_dash3d.sh
Executable file
@@ -0,0 +1,46 @@
#!/bin/bash -e
set -o nounset

# Note: when run as a subprocess something is setting this
# variable, which causes issues; printing for debug information
# and unsetting
if [[ -v MKL_THREADING_LAYER ]];
then
    echo "Unsetting MKL_THREADING_LAYER=$MKL_THREADING_LAYER"
    unset MKL_THREADING_LAYER
fi

# Get the directory where the current script is located
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
KAOLIN_ROOT=$SCRIPT_DIR/../../../..

DASH3D=kaolin-dash3d

USAGE="$0 [log_directory] (optional: port)

Runs dash3d in the background using script:
$DASH3D
"
if [ $# -lt 1 ]; then
    echo "$USAGE"
    exit
fi

FLAGS="--logdir=$1 --log_level=10"  # DEBUG
if [ $# -gt 1 ]; then
    FLAGS="$FLAGS --port=$2"
fi

echo "Running Dash3D in the background using command: "
echo "$DASH3D $FLAGS"

$DASH3D $FLAGS &
PID=$!

sleep 2
set +e
kill -0 $PID  # Check it still runs
if [ "$?" -ne "0" ]; then
    echo "Failed to start dash3d"
    exit 1
fi
140
tests/integration/experimental/dash3d/test_binary_parse.js
Normal file
@@ -0,0 +1,140 @@
// Copyright (c) 2019-2020, NVIDIA CORPORATION. All rights reserved.

// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at

//     http://www.apache.org/licenses/LICENSE-2.0

// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

var assert = require('assert');
var THREE = require('three');
var fs = require('fs');
var path = require('path');

var geometry = require('../../../../kaolin/experimental/dash3d/src/geometry.js');
var util = require('../../../../kaolin/experimental/dash3d/src/util.js');

var binaries = [];
before(function(done) {
  util.set_global_log_level('INFO');  // Change to DEBUG if needed

  var paths = ['meshes0_1.bin', 'meshes2.bin', 'clouds0_1.bin', 'clouds2.bin'];
  for (var i = 0; i < paths.length; ++i) {
    var p = path.join(__dirname, '_out', paths[i]);
    util.timed_log('Parsing binary file at path ' + p);
    var res = fs.readFileSync(p);
    var res_buffer = new Uint8Array(res).buffer;
    binaries.push(res_buffer);
  }
  done();
});

describe("Binary Mesh Parsing", function() {
  describe("Reading and checking two meshes from _out/meshes0_1.bin", function() {
    let geos = null;
    it('two meshes should be parsed', function() {
      geos = geometry.BufferedGeometriesFromBinary(binaries[0], 0);
      assert.equal(geos.length, 2);
    });
    it('two meshes should have correct number of vertices and faces', function() {
      assert.equal(geos[0].getAttribute('position').count, 4);
      assert.equal(geos[0].getIndex().count, 2 * 3);

      assert.equal(geos[1].getAttribute('position').count, 100);
      assert.equal(geos[1].getIndex().count, 100 * 3);
    });
    it('first mesh should have correct geometry values', function() {
      let expected_face_idx = [0, 1, 2, 2, 1, 3];
      for (let i = 0; i < expected_face_idx.length; ++i) {
        assert.equal(geos[0].getIndex().array[i], expected_face_idx[i],
                     'unexpected face index at ' + i);
      }
      let expected_positions = [1.0, 2.0, 3.0,
                                10.0, 20.0, 30.0,
                                2.0, 4.0, 6.0,
                                15.0, 25.0, 35.0];
      for (let i = 0; i < expected_positions.length; ++i) {
        assert.equal(geos[0].getAttribute('position').array[i],
                     expected_positions[i],
                     'unexpected position at ' + i);
      }
    });
    it('correct bounding box should be computed for both meshes', function() {
      let bbox = geometry.GetBoundingBox(geos);
      assert.equal(bbox.min.x, 0);
      assert.equal(bbox.min.y, 1);
      assert.equal(bbox.min.z, 2);
      assert.equal(bbox.max.x, 297);
      assert.equal(bbox.max.y, 298);
      assert.equal(bbox.max.z, 299);
    });
  });
  describe("Reading and checking one mesh from _out/meshes2.bin", function() {
    let geos = null;
    it('one mesh should be parsed', function() {
      geos = geometry.BufferedGeometriesFromBinary(binaries[1], 0);
      assert.equal(geos.length, 1);
    });
    it('one mesh should have correct number of vertices and faces', function() {
      assert.equal(geos[0].getAttribute('position').count, 3000);
      assert.equal(geos[0].getIndex().count, 6000 * 3);
    });
  });
});

describe("Binary Pointcloud Parsing", function() {
  describe("Reading and checking two point clouds from _out/clouds0_1.bin", function() {
    let geos = null;
    it('two point clouds should be parsed', function() {
      geos = geometry.PtCloudsFromBinary(binaries[2], 0);
      assert.equal(geos.length, 2);
    });
    it('two point clouds should have correct number of points', function() {
      assert.equal(geos[0].instanceCount, 4);
      assert.equal(geos[1].instanceCount, 100);
    });
    it('first point cloud should have correct geometry values', function() {
      let expected_positions = [1.0, 2.0, 3.0,
                                10.0, 20.0, 30.0,
                                2.0, 4.0, 6.0,
                                15.0, 25.0, 35.0];
      for (let i = 0; i < expected_positions.length; ++i) {
        assert.equal(geos[0].getAttribute('instanceTranslation').array[i],
                     expected_positions[i],
                     'unexpected position at ' + i);
      }
    });
    it('second point cloud should have correct geometry values', function() {
      for (let i = 0; i < 300; ++i) {
        assert.equal(geos[1].getAttribute('instanceTranslation').array[i],
                     i + 0.0,
                     'unexpected position at ' + i);
      }
    });
    it('correct bounding box should be computed for both point clouds', function() {
      let bbox = geometry.GetBoundingBox(geos);
      assert.equal(Math.round(bbox.min.x * 1000), 0);
      assert.equal(Math.round(bbox.min.y * 1000), 1 * 1000);
      assert.equal(Math.round(bbox.min.z * 1000), 2 * 1000);
      assert.equal(Math.round(bbox.max.x * 1000), 297 * 1000);
      assert.equal(Math.round(bbox.max.y * 1000), 298 * 1000);
      assert.equal(Math.round(bbox.max.z * 1000), 299 * 1000);
    });
  });
  describe("Reading and checking one point cloud from _out/clouds2.bin", function() {
    let geos = null;
    it('one point cloud should be parsed', function() {
      geos = geometry.PtCloudsFromBinary(binaries[3], 0);
      assert.equal(geos.length, 1);
    });
    it('one point cloud should have correct number of points', function() {
      assert.equal(geos[0].instanceCount, 3000);
    });
  });
})
@@ -0,0 +1,88 @@
# Copyright (c) 2019-2020, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import numpy as np
import os
import pytest
import shutil
import torch

import kaolin

from kaolin.utils.testing import tensor_info

from kaolin.experimental.dash3d.util import meshes_to_binary
from kaolin.experimental.dash3d.util import point_clouds_to_binary


@pytest.fixture(scope='module')
def out_dir():
    # Create temporary output directory
    out_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), '_out')
    os.makedirs(out_dir, exist_ok=True)
    yield out_dir
    shutil.rmtree(out_dir)  # Note: comment out to keep the output directory


@pytest.fixture(scope='module')
def meshes():
    vertices0 = np.array([[1.0, 2.0, 3.0],
                          [10.0, 20.0, 30.0],
                          [2.0, 4.0, 6.0],
                          [15.0, 25.0, 35.0]], dtype=np.float32)
    faces0 = np.array([[0, 1, 2],
                       [2, 1, 3]], dtype=np.int32)
    vertices1 = np.arange(0, 300).reshape((-1, 3))
    faces1 = np.stack([np.arange(0, 100),
                       np.mod(np.arange(0, 100) + 1, 100),
                       np.mod(np.arange(0, 100) + 2, 100)]).astype(np.int32).reshape((-1, 3))
    vertices2 = np.random.random((1, 9000)).reshape((-1, 3))
    faces2 = np.stack([np.mod(np.arange(0, 6000), 1000),
                       np.ones((6000,)),
                       np.random.randint(0, 2999 + 1, (6000,))]).astype(np.int32).reshape((-1, 3))
    return {"faces": [faces0, faces1, faces2],
            "vertices": [vertices0, vertices1, vertices2]}


@pytest.fixture(scope='module')
def pointclouds():
    pts0 = np.array([[1.0, 2.0, 3.0],
                     [10.0, 20.0, 30.0],
                     [2.0, 4.0, 6.0],
                     [15.0, 25.0, 35.0]], dtype=np.float32)
    pts1 = np.arange(0, 300).astype(np.float32).reshape((-1, 3))
    pts2 = np.random.random((1, 9000)).astype(np.float32).reshape((-1, 3))
    return {"positions": [pts0, pts1, pts2]}


class TestBinaryEncoding:
    def test_server_client_binary_compatibility(self, meshes, pointclouds, out_dir):
        # Encode and write mesh0+mesh1 and mesh2 to binary files
        binstr = meshes_to_binary(meshes['vertices'][0:2], meshes['faces'][0:2])
        with open(os.path.join(out_dir, 'meshes0_1.bin'), 'wb') as f:
            f.write(binstr)
        binstr = meshes_to_binary([meshes['vertices'][2]], [meshes['faces'][2]])
        with open(os.path.join(out_dir, 'meshes2.bin'), 'wb') as f:
            f.write(binstr)

        # Encode and write ptcloud0+ptcloud1 and ptcloud2 to binary files
        binstr = point_clouds_to_binary(pointclouds['positions'][0:2])
        with open(os.path.join(out_dir, 'clouds0_1.bin'), 'wb') as f:
            f.write(binstr)
        binstr = point_clouds_to_binary([pointclouds['positions'][2]])
        with open(os.path.join(out_dir, 'clouds2.bin'), 'wb') as f:
            f.write(binstr)

        # Execute the javascript test that checks that these are parsed correctly
        js_test = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'test_binary_parse.js')
        os.system('npx mocha {}'.format(js_test))  # TODO: will npx work for everyone?