Hello,

127.0.0.1:8000 is lava-server-gunicorn.
Do you use apache2 as a reverse proxy?

Have you tried reverting to using "sync" and restarting lava-server-gunicorn?
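In case it helps, one way to do that is a systemd drop-in; the path and the WORKER_CLASS variable below are assumptions, so check how your local lava-server-gunicorn unit actually selects the gunicorn worker class:

```ini
# /etc/systemd/system/lava-server-gunicorn.service.d/override.conf
# Hypothetical drop-in: assumes the unit reads WORKER_CLASS from the
# environment to choose the gunicorn worker class.
[Service]
Environment=WORKER_CLASS=sync
```

followed by `systemctl daemon-reload` and `systemctl restart lava-server-gunicorn`.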


Rgds

On Fri, Sep 18, 2020 at 05:41, Larry Shen <larry.shen@nxp.com> wrote:

No, not in the same network.

 

And I checked the log again; it seems that every time the 502 issue happens, there is something like the following:

{"log":"[Tue Sep 15 02:14:58.023302 2020] [proxy:error] [pid 332:tid 139822047946496] (32)Broken pipe: [client 10.192.244.203:54182] AH01084: pass request body failed to 127.0.0.1:8000 (127.0.0.1)\n","stream":"stdout","time":"2020-09-15T02:14:58.319479074Z"}

{"log":"[Tue Sep 15 02:14:58.023352 2020] [proxy_http:error] [pid 332:tid 139822047946496] [client 10.192.244.203:54182] AH01097: pass request body failed to 127.0.0.1:8000 (127.0.0.1) from 10.192.244.203 ()\n","stream":"stdout","time":"2020-09-15T02:14:58.319481826Z"}

 

What is 127.0.0.1:8000? The reverse proxy in the LAVA setup? Any suggestions?

 

From: Remi Duraffort <remi.duraffort@linaro.org>
Sent: Thursday, September 17, 2020 3:37 PM
To: Larry Shen <larry.shen@nxp.com>
Cc: Milosz Wasilewski <milosz.wasilewski@linaro.org>; lava-users@lists.lavasoftware.org
Subject: Re: [Lava-users] [EXT] Re: Issues about XMLRPC & lavacli.

 

Caution: EXT Email

Are you in the same network as the server?

 

On Wed, Sep 16, 2020 at 04:37, Larry Shen <larry.shen@nxp.com> wrote:

Yes, Remi, we tried it like the following:

 

staging:
  token: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  uri: http://lava-staging.sw.nxp.com/RPC2
  username: larry.shen
  timeout: 300.0

The only result is that the error message becomes:

Unable to connect: HTTPConnectionPool(host='lava-master.sw.nxp.com', port=80): Read timed out. (read timeout=300.0)

 

And other people who use XMLRPC directly to submit, without setting a timeout, get a 502 right away; I think the root cause is the same. The server is holding connections it cannot respond to, so lavacli times out even at 300 seconds.
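For scripts that call /RPC2 directly, two client-side mitigations can be sketched: setting an explicit socket timeout (the equivalent of lavacli's `timeout:` setting) and retrying on transient gateway errors. Everything below is illustrative, not lavacli code; the class and helper names, retry counts, and delays are assumptions:

```python
import time
import xmlrpc.client


class TimeoutTransport(xmlrpc.client.Transport):
    """Transport that applies a socket timeout to each HTTP connection."""

    def __init__(self, timeout=300.0):
        super().__init__()
        self._timeout = timeout

    def make_connection(self, host):
        # The parent creates (and caches) an http.client.HTTPConnection;
        # we only override its timeout, in seconds.
        conn = super().make_connection(host)
        conn.timeout = self._timeout
        return conn


def call_with_retry(func, *args, retries=3, delay=5.0):
    """Retry an XML-RPC call on a 502 from the proxy or a read timeout.

    ProtocolError covers HTTP-level failures such as 502 Bad Gateway;
    OSError covers socket read timeouts. Other exceptions propagate.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return func(*args)
        except (xmlrpc.client.ProtocolError, OSError) as exc:
            last_exc = exc
            if attempt < retries - 1:
                time.sleep(delay)
    raise last_exc
```

Used as, e.g., `proxy = xmlrpc.client.ServerProxy(uri, allow_none=True, transport=TimeoutTransport(300.0))` and then `call_with_retry(proxy.scheduler.submit_job, job)`. Note this only papers over the symptom; it does not fix whatever makes the server hold connections.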

 

From: Remi Duraffort <remi.duraffort@linaro.org>
Sent: Tuesday, September 15, 2020 8:14 PM
To: Larry Shen <larry.shen@nxp.com>
Cc: Milosz Wasilewski <milosz.wasilewski@linaro.org>; lava-users@lists.lavasoftware.org
Subject: Re: [Lava-users] [EXT] Re: Issues about XMLRPC & lavacli.

 


Have you tried increasing the lavacli default timeout? Maybe the network connection to the server is flaky?

 

 

Rgds

 

On Tue, Sep 15, 2020 at 12:06, Larry Shen <larry.shen@nxp.com> wrote:

I just checked the logs; it looks like there is nothing in the server log.

We use a containerized master, and we get the following from docker logs; these entries appear when a user's job submission sometimes fails. Could this be the cause? What does it mean?

{"log":"10.193.108.249 - - [15/Sep/2020:02:12:01 +0000] \"POST /RPC2 HTTP/1.1\" 200 587 \"-\" \"lavacli v0.9.7\"\n","stream":"stdout","time":"2020-09-15T02:12:02.222152631Z"}
{"log":"10.192.244.28 - - [15/Sep/2020:02:12:01 +0000] \"GET /scheduler/job/109711/job_status HTTP/1.1\" 200 634 \"http://lava-master.sw.nxp.com/scheduler/job/109711\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.102 Safari/537.36\"\n","stream":"stdout","time":"2020-09-15T02:12:02.222155609Z"}
{"log":"10.192.244.28 - - [15/Sep/2020:02:12:01 +0000] \"GET /scheduler/job/109711/log_pipeline_incremental?line=102 HTTP/1.1\" 200 6071 \"http://lava-master.sw.nxp.com/scheduler/job/109711\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.102 Safari/537.36\"\n","stream":"stdout","time":"2020-09-15T02:12:02.222178222Z"}
{"log":"10.193.108.249 - - [15/Sep/2020:02:12:02 +0000] \"POST /RPC2 HTTP/1.1\" 200 433 \"-\" \"lavacli v0.9.7\"\n","stream":"stdout","time":"2020-09-15T02:12:02.22218312Z"}
{"log":"ERROR:linaro-django-xmlrpc-dispatcher:Internal error in the XML-RPC dispatcher while calling method 'scheduler.jobs.show' with ('Unable',)\n","stream":"stderr","time":"2020-09-15T02:12:02.670767388Z"}
{"log":"Traceback (most recent call last):\n","stream":"stderr","time":"2020-09-15T02:12:02.670790089Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/linaro_django_xmlrpc/models.py\", line 441, in dispatch\n","stream":"stderr","time":"2020-09-15T02:12:02.670793859Z"}
{"log":"    return impl(*params)\n","stream":"stderr","time":"2020-09-15T02:12:02.670796867Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/lava_scheduler_app/api/jobs.py\", line 383, in show\n","stream":"stderr","time":"2020-09-15T02:12:02.67079951Z"}
{"log":"    job = TestJob.get_by_job_number(job_id)\n","stream":"stderr","time":"2020-09-15T02:12:02.670802465Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/lava_scheduler_app/models.py\", line 2010, in get_by_job_number\n","stream":"stderr","time":"2020-09-15T02:12:02.670805034Z"}
{"log":"    job = query.get(pk=job_id)\n","stream":"stderr","time":"2020-09-15T02:12:02.670807695Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/manager.py\", line 85, in manager_method\n","stream":"stderr","time":"2020-09-15T02:12:02.670810275Z"}
{"log":"    return getattr(self.get_queryset(), name)(*args, **kwargs)\n","stream":"stderr","time":"2020-09-15T02:12:02.67081302Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/query.py\", line 371, in get\n","stream":"stderr","time":"2020-09-15T02:12:02.670815631Z"}
{"log":"    clone = self.filter(*args, **kwargs)\n","stream":"stderr","time":"2020-09-15T02:12:02.67081854Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/query.py\", line 787, in filter\n","stream":"stderr","time":"2020-09-15T02:12:02.670821466Z"}
{"log":"    return self._filter_or_exclude(False, *args, **kwargs)\n","stream":"stderr","time":"2020-09-15T02:12:02.670833942Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/query.py\", line 805, in _filter_or_exclude\n","stream":"stderr","time":"2020-09-15T02:12:02.670836896Z"}
{"log":"    clone.query.add_q(Q(*args, **kwargs))\n","stream":"stderr","time":"2020-09-15T02:12:02.670839627Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/sql/query.py\", line 1250, in add_q\n","stream":"stderr","time":"2020-09-15T02:12:02.670842124Z"}
{"log":"    clause, _ = self._add_q(q_object, self.used_aliases)\n","stream":"stderr","time":"2020-09-15T02:12:02.670844738Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/sql/query.py\", line 1276, in _add_q\n","stream":"stderr","time":"2020-09-15T02:12:02.670847208Z"}
{"log":"    allow_joins=allow_joins, split_subq=split_subq,\n","stream":"stderr","time":"2020-09-15T02:12:02.670849936Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/sql/query.py\", line 1210, in build_filter\n","stream":"stderr","time":"2020-09-15T02:12:02.670852387Z"}
{"log":"    condition = self.build_lookup(lookups, col, value)\n","stream":"stderr","time":"2020-09-15T02:12:02.670855092Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/sql/query.py\", line 1104, in build_lookup\n","stream":"stderr","time":"2020-09-15T02:12:02.670857556Z"}
{"log":"    return final_lookup(lhs, rhs)\n","stream":"stderr","time":"2020-09-15T02:12:02.67086024Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/lookups.py\", line 24, in __init__\n","stream":"stderr","time":"2020-09-15T02:12:02.670862672Z"}
{"log":"    self.rhs = self.get_prep_lookup()\n","stream":"stderr","time":"2020-09-15T02:12:02.670877253Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/lookups.py\", line 74, in get_prep_lookup\n","stream":"stderr","time":"2020-09-15T02:12:02.670880038Z"}
{"log":"    return self.lhs.output_field.get_prep_value(self.rhs)\n","stream":"stderr","time":"2020-09-15T02:12:02.670882674Z"}
{"log":"  File \"/usr/lib/python3/dist-packages/django/db/models/fields/__init__.py\", line 966, in get_prep_value\n","stream":"stderr","time":"2020-09-15T02:12:02.670886231Z"}
{"log":"    return int(value)\n","stream":"stderr","time":"2020-09-15T02:12:02.67088904Z"}
{"log":"ValueError: invalid literal for int() with base 10: 'Unable'\n","stream":"stderr","time":"2020-09-15T02:12:02.670891628Z"}
{"log":"\n","stream":"stdout","time":"2020-09-15T02:12:03.22232776Z"}
{"log":"==\u003e gunicorn.log \u003c==\n","stream":"stdout","time":"2020-09-15T02:12:03.222350349Z"}

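For what it's worth, the traceback above shows `scheduler.jobs.show` being called with the literal string 'Unable' as a job id, so `int('Unable')` fails; some client apparently passed a non-numeric id, possibly the first word of an earlier "Unable to connect" error message reused as an id. A defensive client-side sketch (the helper name is illustrative):

```python
def valid_job_id(value):
    """Return the job id as an int, or None if it is not numeric.

    LAVA job numbers are plain integers (multinode ids such as
    '1234.0' would need extra handling); anything else is rejected
    before it reaches the XML-RPC API.
    """
    try:
        return int(value)
    except (TypeError, ValueError):
        return None
```

Checking the id before calling `scheduler.jobs.show` would turn this server-side 500 into an early client-side error.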
-----Original Message-----
From: Milosz Wasilewski <milosz.wasilewski@linaro.org>
Sent: Tuesday, September 15, 2020 5:39 PM
To: Larry Shen <larry.shen@nxp.com>
Cc: lava-users@lists.lavasoftware.org
Subject: Re: [EXT] Re: [Lava-users] Issues about XMLRPC & lavacli.


On Tue, 15 Sep 2020 at 10:32, Larry Shen <larry.shen@nxp.com> wrote:
>
> Hi, Milosz,
>
> See this:
> https://git.lavasoftware.org/lava/lava/-/merge_requests/1286/diffs#65156a95098dc512e7a4b7047ea511332947f649
>
> In fact I don't care whether I can download big logs.
> I just care about the "job submit failure" issue; I mentioned the big log download only because it looks like it may be related to the same issue...
> I'm not sure what happened here: our environment, or a LAVA code change...?

Looks really weird. We're also running eventlet gunicorn and it actually improved things a lot (no more weird timeouts). Maybe Remi has some better idea.

milosz

>
> -----Original Message-----
> From: Milosz Wasilewski <milosz.wasilewski@linaro.org>
> Sent: Tuesday, September 15, 2020 5:28 PM
> To: Larry Shen <larry.shen@nxp.com>
> Cc: lava-users@lists.lavasoftware.org
> Subject: [EXT] Re: [Lava-users] Issues about XMLRPC & lavacli.
>
> Caution: EXT Email
>
> On Tue, 15 Sep 2020 at 04:04, Larry Shen <larry.shen@nxp.com> wrote:
> >
> > Meanwhile, a strange issue in case help:
> >
> >
> >
> > In the past, when downloading big logs on the web, if the log was too big the request would time out and the download would fail.
> >
> > But now we still get timeouts on 2020.08; shouldn't it be OK with the async worker?
> >
> > What is your expectation for big file downloads with async? Could it be a local settings issue on our side?
>
> I'm not sure if this was enabled by default. In /lib/systemd/system/lava-server-gunicorn.service you should have WORKER_CLASS set to 'eventlet'. If this is not the case, it's most likely the source of your trouble.
>
> milosz
>
> >
> >
> >
> > From: Larry Shen
> > Sent: Tuesday, September 15, 2020 10:52 AM
> > To: lava-users@lists.lavasoftware.org
> > Subject: Issues about XMLRPC & lavacli.
> >
> >
> >
> > Hi, guys,
> >
> >
> >
> > We find an issue related to job submit:
> >
> >
> >
> > 1) One team uses lavacli to submit requests, and sometimes it reports the following:
> >
> >
> >
> > 07-Sep-2020 16:37:35        Unable to connect: HTTPConnectionPool(host='lava-master.sw.nxp.com', port=80): Read timed out. (read timeout=20.0)
> >
> >
> >
> > It looks like this error happens in the following code; what do you think about this issue?
> >
> >
> >
> > try:
> >     # Create the Transport object
> >     parsed_uri = urlparse(uri)
> >     transport = RequestsTransport(
> >         parsed_uri.scheme,
> >         config.get("proxy"),
> >         config.get("timeout", 20.0),
> >         config.get("verify_ssl_cert", True),
> >     )
> >     # allow_none is True because the server does support it
> >     proxy = xmlrpc.client.ServerProxy(uri, allow_none=True, transport=transport)
> >     version = proxy.system.version()
> > except (OSError, xmlrpc.client.Error) as exc:
> >     print("Unable to connect: %s" % exc2str(exc))
> >     return 1
> >
> >
> >
> > 2) Another team wrote their own Python code using XMLRPC to submit jobs, doing something like the following, and it reports:
> >
> >
> >
> > ERROR in XMLRPC.py:submitJob:63 msg: Failed to submit job, reason: <ProtocolError for chuan.su:chuan.su@lava-master.sw.nxp.com/RPC2: 502 Bad Gateway>!
> >
> >
> >
> > try:
> >     job_id = self.connection.scheduler.submit_job(job)
> >     self.logger.debug("Succeeded to submit job, job_id: %d, platform: %s!", job_id, platform)
> >     return job_id
> > except Exception as e:
> >     self.logger.error("Failed to submit job, reason: %s!", str(e))
> >     return None
> >
> >
> >
> > We are currently using LAVA server version 2020.08. Colleagues told me we also hit similar errors in the past, but with very low probability; recently the probability has become very high.
> >
> > I'd like to know whether this could be related to your changes to gunicorn eventlet, or whether there are other possible reasons?
> >
> >
> >
> > Thanks,
> >
> > Larry
> >
> > _______________________________________________
> > Lava-users mailing list
> > Lava-users@lists.lavasoftware.org
> > https://lists.lavasoftware.org/mailman/listinfo/lava-users
_______________________________________________
Lava-users mailing list
Lava-users@lists.lavasoftware.org
https://lists.lavasoftware.org/mailman/listinfo/lava-users


 

--

Rémi Duraffort

LAVA Architect

Linaro
