|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
The PDF conversion call throws the exception below after the app has been idle for a while (more than a day), and starts working again only after an app service restart. The code snippet below is placed in a Web API controller hosted on an Azure App Service running in 32-bit mode.
Code: C#
try
{
    HtmlToPdfOptions op = new HtmlToPdfOptions();
    op.HeaderHtmlFormat = headerDecodeString;
    op.FooterHtmlFormat = footerDecodeString;
    // Note: this sets the global default options, not the local "op" object
    HtmlToPdf.Options.NoCache = false;
    op.StartPageIndex = 1;
    op.RepeatTableHeaderAndFooter = true;
    if (orientation.Equals("portrait"))
    {
        op.PageSize = new SizeF(PdfPageSizes.Letter.Width, PdfPageSizes.Letter.Height);
        op.OutputArea = new RectangleF(0.4f, 0.7f, 7.7f, 9.5f);
    }
    else
    {
        op.PageSize = new SizeF(PdfPageSizes.Letter.Height, PdfPageSizes.Letter.Width);
        op.OutputArea = new RectangleF(0.4f, 0.8f, 10.2f, 7f);
    }
    using (MemoryStream pdfStream = new MemoryStream())
    {
        HtmlToPdf.ConvertHtml(contentDecodeString, pdfStream, op);
        byte[] pdfFileBytes = pdfStream.ToArray();
        response.Content = new PushStreamContent((responseStream, httpContent, tc) =>
        {
            using (MemoryStream outputStream = new MemoryStream(pdfFileBytes))
            {
                outputStream.CopyTo(responseStream);
            }
            responseStream.Close();
        }, "application/pdf");
        response.StatusCode = HttpStatusCode.Created;
        response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
        response.Content.Headers.ContentDisposition.FileName = "sample.pdf";
        response.Content.Headers.ContentLength = pdfFileBytes.Length;
        HtmlToPdf.ClearResult();
        return response;
    }
}
catch (Exception)
{
    // Rethrow with "throw;" to preserve the original stack trace
    throw;
}
Exception:
Message: Child process not ready.
Call Stack:
EO.Internal.jq+e:
at AP_PDF.Controllers.PDFController.HtmlToPDF (AP_PDF, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null: D:\EO PDF Project Git\pdfgeneration-development\AP_PDF\Controllers\PDFController.cs: 130)
at lambda_method (Anonymously Hosted DynamicMethods Assembly, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null)
at System.Web.Http.Controllers.ReflectedHttpActionDescriptor+ActionExecutor+<>c__DisplayClass6_2.<GetExecutor>b__2 (System.Web.Http, Version=5.2.7.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
at System.Web.Http.Controllers.ReflectedHttpActionDescriptor+ActionExecutor.Execute (System.Web.Http, Version=5.2.7.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ExecuteAsync (System.Web.Http, Version=5.2.7.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089)
at System.Web.Http.Controllers.ApiControllerActionInvoker+<InvokeActionAsyncCore>d__1.MoveNext (System.Web.Http, Version=5.2.7.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089)
at System.Web.Http.Controllers.ActionFilterResult+<ExecuteAsync>d__5.MoveNext (System.Web.Http, Version=5.2.7.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089)
at System.Web.Http.Dispatcher.HttpControllerDispatcher+<SendAsync>d__15.MoveNext (System.Web.Http, Version=5.2.7.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Have you tried setting EO.Base.Runtime.EnableEOWP to true? See here for more details:
https://www.essentialobjects.com/doc/common/eowp.aspx
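For example (a minimal sketch assuming a Global.asax; the flag should be set at startup, before the first conversion runs):
Code: C#
protected void Application_Start()
{
    // Run conversions in the separate eowp.exe worker process
    EO.Base.Runtime.EnableEOWP = true;
    // ... the rest of your startup code (routes, Web API config, etc.) ...
}
Thanks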
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Team,
We set EO.Base.Runtime.EnableEOWP = true in Application_Start, but we still see the same issue after the app is idle for a few hours.
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Hi,
Which version are you using?
Thanks
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Version: 18.3.46.0
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
We are not sure what else to tell you. The last time you had a problem with Azure App Service we were not able to reproduce it in our environment, and I assume you somehow resolved the issue yourself. If you can give us access to your system (ideally a separate test system, since we may need to replace files or restart services during troubleshooting), we would be very happy to investigate further there.
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Team,
We sent the details to your support mailbox; please let me know if you need anything more.
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Team,
Please let us know whether you have received the details shared on the support mailbox, and whether you have had a chance to replicate the issue.
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Hi,
We did receive it and we have been working on it. Please keep the system available; we will let you know as soon as we find anything or need anything else.
Thanks!
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Team,
Were you able to replicate the issue, and is there any resolution for it yet?
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Hi,
We are able to replicate it with the old build. Right now we are testing the issue with the new 2019 build (we have already updated your server with it), and the problem no longer seems to occur. However, we have still noticed a few abnormal things through our internal debug tools. Since a single test takes a while to run (as you know, the original issue takes more than a day to reproduce), the whole process has been taking a long time. Please bear with us; as soon as we reach a definite conclusion we will reply here again.
Thanks!
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Any further update on this?
We also tried the 2019 version DLL but had no luck; it threw the same exception after being idle for 12-15 hours.
Please help us resolve this as soon as possible.
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Yes, we are able to reproduce the problem with the 2019 build as well. We are adding debug features into our DLLs and running them on your server. Most recently we added this feature, which is currently running on your server:
https://www.essentialobjects.com/doc/eo.base.runtime.startdebugmonitor.aspx
This gives us some insight, but the debug information we have retrieved so far still puzzles us. Each cycle we add more debug information, copy the build over to your server, and then wait a day to get something back. Because every change requires another day of waiting, the whole process has been taking a long time. Do you know of any App Service setting that can shorten this wait? If we could manually trigger whatever happens after 12 to 15 hours, it would be a great help.
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
We have not found any setting that would shorten the time needed to replicate the issue.
Do you have any alternative solution to keep PDF generation working? We are going live this weekend.
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Hi,
So far we have only narrowed the problem down to our "bookkeeping" process. This is the first eowp.exe that we start (if you see multiple eowp.exe processes, you can identify the bookkeeping one by checking the command line arguments of those processes; it has the argument "--eoim"). This process runs a named pipe server that accepts connections from the other eowp.exe processes. The problem appears to be that for some reason this named pipe server runs into trouble and no longer accepts connections.
We have been adding logs to this process and collecting them from your server. However, the extra logging may have changed certain timing factors: the current build running on your server (build .65) has not demonstrated the problem for almost 2 days. We will check again tomorrow to see whether it reappears.
We will not have a permanent solution until we find out exactly how it stops receiving connections. In the meantime, as a workaround, you can simply kill this process programmatically. Our library is designed to recover and automatically recreate it, so you can add a timer in your code that searches for and kills this particular eowp.exe process (see the sketch below) and then rely on the recovery mechanism to recreate it.
Please keep in mind that automatically killing our worker process may cause conversions that are in progress at that moment to fail. In practice the chance of this is low, because this particular process is mainly used when other child processes are created; most of the time it sits idle, and killing it while idle has no impact on your application. In fact, you can even kill all eowp.exe processes and they will be automatically recreated, though doing so significantly increases the chance of interrupting an in-progress conversion.
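For illustration, a minimal sketch of such a watchdog timer. This is an assumption-laden sketch, not an official API: it assumes the bookkeeping process is named eowp.exe with "--eoim" on its command line (as described above), that WMI is available to read process command lines, and EowpWatchdog and its members are hypothetical names.
Code: C#
using System;
using System.Diagnostics;
using System.Management; // reference System.Management.dll
using System.Threading;

public static class EowpWatchdog
{
    private static Timer _timer;

    // Call once from Application_Start; the one-hour interval is arbitrary.
    public static void Start()
    {
        _timer = new Timer(_ => KillBookkeepingProcess(), null,
            TimeSpan.FromHours(1), TimeSpan.FromHours(1));
    }

    private static void KillBookkeepingProcess()
    {
        // Find the eowp.exe instance whose command line contains "--eoim"
        // (the bookkeeping process) and kill it; the library is designed
        // to recreate it automatically on demand.
        string query =
            "SELECT ProcessId, CommandLine FROM Win32_Process WHERE Name = 'eowp.exe'";
        using (var searcher = new ManagementObjectSearcher(query))
        {
            foreach (ManagementObject mo in searcher.Get())
            {
                string commandLine = mo["CommandLine"] as string;
                if (commandLine == null || !commandLine.Contains("--eoim"))
                    continue;
                try
                {
                    Process.GetProcessById(Convert.ToInt32(mo["ProcessId"])).Kill();
                }
                catch (Exception) { /* it may have exited already */ }
            }
        }
    }
}
If WMI queries are restricted in your hosting sandbox, Process.GetProcessesByName("eowp") can enumerate the worker processes instead, at the cost of killing all of them (the blunter variant mentioned above).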
We will update you as soon as we have more information on this issue.
Thanks!
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
As you said in your last response, it has been working fine even after being idle for 2 days. Is there any further update on the fix?
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Hi,
We have been testing it every day; today is the 5th day and it is still working fine. The problem is we do not know why. We will have a new build out either today or tomorrow with the new code, and you can switch to that build. In the meantime we will keep testing, and hopefully we can find a definitive answer on exactly what caused it.
Thanks!
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
I took the latest version and deployed it to two different app services; one is working fine after being idle for 3 days, while the other is throwing the same exception.
Do any code changes need to be applied for this to work, and is there any further update on your testing?
|
|
Rank: Administration Groups: Administration
Joined: 5/27/2007 Posts: 24,258
|
Hi,
Can you try calling the following two methods in your Application_Start on the App Service that is not working?
https://www.essentialobjects.com/doc/eo.base.runtime.setlogsize.aspx
https://www.essentialobjects.com/doc/eo.base.runtime.startdebugmonitor.aspx
The first method sets our internal log size; give it a large value such as 100000. For the second method, pass a unique name and then PM us the name. Keep it running until the problem occurs again; we will then use our remote debugger interface to collect the logs and see what we can find.
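For example (a sketch only; the exact signatures are on the documentation pages above, and "YourUniqueName" is a placeholder you would replace and PM to us):
Code: C#
protected void Application_Start()
{
    EO.Base.Runtime.EnableEOWP = true;
    // Enlarge the internal log so enough history survives until we collect it
    EO.Base.Runtime.SetLogSize(100000);
    // Start the debug monitor under a unique name; PM us the name you used
    EO.Base.Runtime.StartDebugMonitor("YourUniqueName");
}
Thanks!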
|
|
Rank: Advanced Member Groups: Member
Joined: 12/14/2018 Posts: 31
|
Is there any further update on a permanent fix for this?
I am still seeing the issue after the app is idle for a few days, and I don't see any requests from your end on the Azure app service that we shared with you.
|
|