Hello LAVA experts,
I need your help with a more complex test we would like to run.
We'd like to test a gateway node that talks to remote sensors.
* the gateway node is a Cortex-A class board (e.g. an RPi3)
* the sensors may be a mixture of Cortex-M (non-Linux) and Cortex-A (Linux) devices
* communication between the gateway node and the sensors may use different protocols: Bluetooth, Wi-Fi, LoRa, Zigbee
* besides the gateway node, we need to control every sensor and be able to provision and power-cycle them
We are fairly confident about running tests on single boards (with and without interaction with LXC containers), but since the scenario above is more complex we need some guidance on how it can be achieved with LAVA.
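To make it a bit more concrete, the shape of multinode job we have in mind is roughly the following (only a sketch: the role names, device types and counts are illustrative, not a working job):

protocols:
  lava-multinode:
    roles:
      gateway:
        device_type: bcm2837-rpi-3-b-32   # whatever Cortex-A device type is available
        count: 1
      sensor:
        device_type: frdm-k64f            # placeholder for a Cortex-M device type
        count: 2
    timeout:
      minutes: 30

# every deploy/boot/test action would then carry a "role:" list so it only runs
# on the gateway or on the sensors, and the test shells would coordinate the two
# sides with lava-send/lava-wait/lava-sync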
Our main open question is how LAVA interacts with Cortex-M devices, especially how it handles their deployment.
Does any of you have a similar scenario? What are the main challenges? How complicated, and how robust, is it to run something like this with LAVA?
Can you point us to documentation or examples?
Any help is very much appreciated.
Thanks!
--
Diego Russo | Staff Software Engineer | Mbed Linux OS
ARM Ltd. CPC1, Capital Park, Cambridge Road, Fulbourn, CB21 5XE, United Kingdom
http://www.diegor.co.uk - https://os.mbed.com/linux-os/
Hello,
We have some tests that are currently expected to fail, and should then become passes once the implementation of the feature is complete.
Is this something LAVA results can support? As far as I can see the result can be pass/fail/skip (and possibly unknown).
Suggestions welcome.
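One workaround we have considered is inverting the result ourselves in the test shell, along these lines (just a sketch; run_feature_x_test stands in for the real test command):

run:
  steps:
  # feature not implemented yet: report pass while the command still fails,
  # and flip this back to a plain lava-test-case once the feature lands
  - if run_feature_x_test; then lava-test-case feature-x-xfail --result fail; else lava-test-case feature-x-xfail --result pass; fi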
Thanks.
Pete
Hello,
we are trying to add external interfaces to the worker to connect to the DUTs, for example a 4-port USB-to-RS232 converter. Our DUTs have multiple RS232 ports which shall be tested using this interface on the worker.
We have already figured out how to integrate this hardware into the LAVA environment, so that it can be used within the LAVA LXC (using static_info in the device dictionary, resulting in the four /dev/ttyUSB* devices being visible there).
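For reference, the relevant part of our device dictionary currently looks something like this (the serial number is only an example):

{% set static_info = [{'board_id': 'FT123456'}] %}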
First question: we need several of these converters attached to the worker. How do we integrate them into LAVA? They all have the same board_id, vendor_id and product_id. If I specify the board_id in the device dictionary multiple times, the device is still added only once.
Second question: we need a way to specify which of the /dev/ttyUSB* ports a certain RS232 port of the DUT is connected to. The place where I would expect to put such information is the device dictionary. But how can we access this information within a LAVA test shell?
The documentation describes a similar mechanism for energy probes:
https://validation.linaro.org/static/docs/v2/admin-lxc-deploy.html?highligh…
It says "Devices which are not directly attached to the worker can also be supported, for example energy probes which communicate over the network".
As far as I can tell from the code, though, this seems to be a hard-coded feature without any possibility of adding other custom hardware. Is that correct?
If yes, why isn't there a generic mechanism to supply static_info from the device dictionary to the LAVA test shell? Or is there?
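What we would imagine is something along these lines, i.e. being able to attach arbitrary keys to the entries and read them back from the test shell (purely hypothetical syntax, this does not exist as far as we can tell):

{# hypothetical: extra custom keys next to board_id #}
{% set static_info = [
    {'board_id': 'FT123456', 'dut_port': 'RS232-1'},
    {'board_id': 'FT654321', 'dut_port': 'RS232-2'}
] %}
{# ...plus some helper or environment variable in the LAVA test shell exposing 'dut_port' #}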
How can we implement our scenario described above using LAVA?
Best regards
Tim Jaacks
DEVELOPMENT ENGINEER
Garz & Fricke GmbH
Tempowerkring 2
21079 Hamburg
Direct: +49 40 791 899 - 55
Fax: +49 40 791899 - 39
tim.jaacks(a)garz-fricke.com
www.garz-fricke.com
WE MAKE IT YOURS!
After gathering statistics on the robustness of my setup, the next step is to get a complete view of the LAVA errors we hit in incomplete jobs.
From what I see in incomplete jobs, my intention is to query on the "lava" test suite and the test case named "job".
In the query builder, if I use "test suite" as the condition model, I can't use "job" as the field name.
Do you have any advice on how to proceed?
Denis
Hi all,
I'm facing a pretty frustrating issue when running CTS/VTS with LAVA.
I'm using Linaro's tradefed test definition:
https://git.linaro.org/qa/test-definitions.git/tree/automated/android/trade…
During some runs, the adb connection is lost, leading to an incomplete test job.
Do you know if this behaviour is known and fairly widespread? Or is it a bad configuration on my side?
Does anyone know a way to keep a reliable adb connection to the target?
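Something like a reconnect wrapper around adb is what I have been considering on my side (untested sketch using plain adb commands; the serial would come from the job):

#!/bin/sh
# untested sketch: try to recover a dropped adb connection before giving up
SERIAL="$1"   # device serial as seen by adb
for attempt in 1 2 3 4 5; do
    if adb -s "$SERIAL" get-state >/dev/null 2>&1; then
        exit 0
    fi
    echo "adb connection to $SERIAL lost, reconnect attempt $attempt"
    adb reconnect
    adb -s "$SERIAL" wait-for-device
    sleep 5
done
exit 1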
Best regards,
Axel
Sometimes I can see feedback entries in the LAVA log repeating the log output, and sometimes I cannot.
Could you give an example of a simple inline test, so that I can always see some feedback in the log?
Something like:
- test:
    timeout:
      minutes: 1
    definitions:
    - from: inline
      name: smoke-case
      path: inline/test.yaml
      repository:
        metadata:
          format: Lava-Test Test Definition 1.0
          name: smoke-case-run
          description: Run smoke case
        run:
          steps:
          - lava-test-case "Case_001" --shell 'echo "Case001 gstreamer ok";'
How could I make `Case001 gstreamer ok` also appear at the feedback log level?
Hello,
I have recently upgraded from 2018.11 to 2019.03 and have noticed that the results of a lot of the tests I have been running are no longer parsed correctly by LAVA. This was because I was sending the results using upper-case result strings.
E.g. lava-test-case <test-case> PASS as opposed to lava-test-case <test-case> pass
This resulted in the following logs in the lava job:
Received signal: <TESTCASE> TEST_CASE_ID=ETH_T_001 RESULT=PASS
Bad test result: PASS
Changing my results-parsing script to send only lower-case result strings fixed the issue, but was this restriction intended with the upgrade?
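For reference, the fix on my side boils down to lower-casing the result string before calling lava-test-case, something like this (simplified from my parsing script):

# RESULT used to be sent as "PASS"/"FAIL"; lower-case it first
RESULT=$(echo "$RESULT" | tr '[:upper:]' '[:lower:]')
lava-test-case "$TEST_CASE_ID" --result "$RESULT"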
Kind Regards,
Patryk
Hello,
I have a system that, as soon as LAVA logs in, requires the password to be changed (the new password needs to be typed twice).
Is there an easy way to automate it using LAVA?
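Something like an interactive test action is what I had in mind, roughly along these lines (untested sketch; the prompt strings and the new password are placeholders that would need to match what the system actually prints):

- test:
    timeout:
      minutes: 2
    interactive:
    - name: force-password-change
      # placeholder prompts, they must match the DUT's real output
      prompts: ["New password:", "Retype new password:", "login:"]
      script:
      - command: newpassword123
        name: set-password
      - command: newpassword123
        name: confirm-password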
Cheers
--
Diego Russo | Staff Software Engineer | Mbed Linux OS
ARM Ltd. CPC1, Capital Park, Cambridge Road, Fulbourn, CB21 5XE, United Kingdom
http://www.diegor.co.uk - https://os.mbed.com/linux-os/
Hi all,
I'm currently trying to make a multinode job for CTS on Android 9.
During this job, I need to unlock U-Boot, so I use the "interactive" test action.
But when I add it to my job definition, I get an error message during the run:
"Nothing to run. Maybe the 'deploy' stage is missing, otherwise this is a bug which should be reported."
If I remove the test action with the "interactive" method, I don't get that error.
Maybe I'm missing something, but I can't see what it could be.
You will find attached an example test job which leads to this error.
Best regards,
Axel