This article has been moved to its new home here: https://benperk.github.io/msdn/2014/2014-09-using-logparser-to-analyze-the-eventlog-xml-azure-websites.html
If you have an active Azure Web App and you experience problems, or simply want to check what the problem might be, there is a file called EVENTLOG.XML which contains exceptions and other information that may be helpful. The file is located in the /LogFiles directory of your Web App. Figure 1 illustrates how that might look when using an FTP client such as FileZilla.
Figure 1, EVENTLOG.XML file for an Azure App Service Web App
As this file can be large, you might consider using a tool like LogParser to perform the analysis, as manual analysis might cause your eyes to hurt. I wrote a troubleshooting article about this tool here; it focuses on IIS logs, but you can use it for checking XML files as well.
The selectable columns correspond to the elements of each event entry in the XML file.
An example LogParser query is shown below, with the result in Figure 2.
LOGPARSER "SELECT Name, EventId, Count(*) FROM EVENTLOG.XML GROUP BY Name, EventId" -i:XML
Figure 2, Analyzing Azure App Service Web App EVENTLOG.XML file using LogParser
If you see a large number of exceptions in the file, you can then write another query to look at the Data for those entries. For example, a query to see what the Data is for all the ASP.NET entries where EventId = 1309, as shown below and in Figure 3.
LOGPARSER "SELECT Name, Data FROM EVENTLOG.XML WHERE EventId='1309'" -i:XML
Figure 3, Logparser query for Azure App Service Web App EVENTLOG.XML file analysis
This can help you isolate and ultimately find the reason for poor performance or unexpected behavior on your Web App.
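If LogParser is not available where you are, the same Name/EventId grouping can be sketched on the JVM with the JDK's built-in DOM parser. This is only a hypothetical illustration: the Event, Provider Name, and EventID element and attribute names below are assumptions based on the standard Windows event log XML shape, so verify them against your own EVENTLOG.XML before relying on this.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class EventLogCounter {
    // Count events per (provider name, event id), mirroring the
    // "GROUP BY Name, EventId" LogParser query above.
    // ASSUMPTION: entries look like
    //   <Event><System><Provider Name="..."/><EventID>...</EventID></System>...</Event>
    public static Map<String, Integer> countEvents(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList events = doc.getElementsByTagName("Event");
        Map<String, Integer> counts = new HashMap<>();
        for (int i = 0; i < events.getLength(); i++) {
            Element event = (Element) events.item(i);
            String name = ((Element) event.getElementsByTagName("Provider").item(0))
                    .getAttribute("Name");
            String id = event.getElementsByTagName("EventID").item(0).getTextContent();
            counts.merge(name + "/" + id, 1, Integer::sum);
        }
        return counts;
    }
}
```

The output map plays the role of the Name, EventId, Count(*) result set from the LogParser query.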
I often use LogParser Lizard to perform queries, and you can use it in this context too. Simply select the XML Input Format, as shown in Figure 4, and execute the same queries.
Figure 4, using LogParser Lizard to view Azure App Service Web App EVENTLOG.XML logs
You can also manually view the event logs within the KUDU console, as I describe here. You need to add /Support to the end of your KUDU URL or select the Support option from the Tools menu, as shown in Figure 5.
Figure 5, using KUDU to view Azure App Service Web App event logs
Then select the Analyze feature, then Event Viewer, as shown in Figure 6. The contents displayed here are the same as those in the EVENTLOG.XML file.
Figure 6, using KUDU to view Azure App Service Web App event logs
Then you can scroll through the entries manually.
I'm developing a Spark/Scala Application that can read and parse a custom log file. I'm having trouble parsing multi-line log entries. Here's a snippet of my code:
Some of the entries in the log file span multiple lines. The regex works fine for single-line entries, but when a multi-line entry like the one shown below is read,
I receive this error:
Cannot parse log line:com.xxx.common.service.ServiceException: system failure: Unable to connect to ANY server: LdapDataSource{id=xxx, type=xxx, enabled=true, name=xxx, host=xxx port=999, connectionType=ssl, username=xxx, folderId=99999}
How can I get Spark to read multi-line log entries from a log file?
asked by c-rod
1 Answer
Since the input files are small, you can use SparkContext.wholeTextFiles.
answered by zero323
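The point of SparkContext.wholeTextFiles is that each file arrives as a single (path, content) pair, so you can split the content into logical entries yourself instead of relying on line-by-line reading. A minimal sketch of that splitting step, shown here in plain Java for brevity; the timestamp pattern marking the start of an entry is a made-up example, so adjust it to your own log format:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class MultiLineLogSplitter {
    // HYPOTHETICAL entry-start pattern: a line beginning with
    // "yyyy-MM-dd HH:mm:ss". Replace it with whatever marks the
    // start of an entry in your own log format.
    private static final String ENTRY_START =
            "(?m)(?=^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2})";

    // Split a whole file's content into logical entries, so that a
    // stack trace following a timestamped line stays attached to
    // that entry instead of being parsed as separate lines.
    public static List<String> splitEntries(String content) {
        return Arrays.stream(content.split(ENTRY_START))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .collect(Collectors.toList());
    }
}
```

In the Spark job itself you would apply this per file, for example (in Scala) sc.wholeTextFiles(path).flatMap { case (_, text) => splitEntries(text) }, and then run your existing single-entry regex over each returned entry.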
I'm trying to use Log Parser within PowerShell to export a Windows Evtx log file to CSV:
But when I run this I get an error:
Error: detected extra argument '*' after query
The error code is 13. I tried putting the paths in single quotes and running it from the same directory as the logs but it keeps returning the same error.
asked by smwk
1 Answer
You need to preserve the double quotes around the query string; otherwise it won't be recognized as a single argument by the spawned process. Putting the query string (with double quotes) in single quotes might work. However, a much simpler solution is to avoid Start-Process entirely and use the call operator (&) instead.
answered by Ansgar Wiechers
In 0.9.0, viewing worker logs was simple: they were one click away from the Spark UI home page.
Now (1.0.0+) I cannot find them. Furthermore, the Spark UI stops working when my job crashes. This is annoying; what is the point of a debugging tool that only works when your application does not need debugging? According to http://apache-spark-user-list.1001560.n3.nabble.com/Viewing-web-UI-after-fact-td12023.html I need to find out what my master URL is, but I don't know how; Spark doesn't print this information at startup. All it says is:
and obviously
http://yarn-client:8080 doesn't work. Some sites describe how finding logs under YARN has become much harder: rather than just being in the UI, you have to log in to the boxes to find them. Surely this is a massive regression, and there has to be a simpler way?
How am I supposed to find out what the master URL is? How can I find my worker (now called executor) logs?
asked by samthebest
2 Answers
Depending on your YARN NodeManager log aggregation configuration, the Spark job logs are aggregated automatically. The runtime logs can usually be found in the following ways:
Spark Master Log
If you're running in yarn-cluster mode, go to the YARN scheduler web UI. You can find the Spark master log there; the 'log' button on the job description page shows its content.
With yarn-client, the driver runs inside your spark-submit command, so what you see is the driver log, provided log4j.properties is configured to output to stderr or stdout.
Spark Executor Log
Search for 'executorHostname' in the driver logs to find the host each executor ran on.
answered by suztomo
These answers document how to find the logs from the UI or from the command line.
For the UI, on an edge node, look in /etc/hadoop/conf/yarn-site.xml for the YARN resource manager URI (yarn.resourcemanager.webapp.address).
Or use the yarn logs command-line tool.
answered by samthebest