Restart TomEE frequently because URL is not responding


# 1  

Good morning, I need your help, please.
In a production system, an application comprises TomEE plus the core application, distributed across 4 nodes (servers).

Many times the URL for node1 is not responding; sometimes there is a delay of 2 minutes to load, sometimes I get a 404 error. The temporary solution is to restart TomEE almost daily, but this is not good practice because this URL receives thousands of HTTPS requests from the internet.

Behind those 4 nodes there is a load balancer (LB), so these questions arise:

1. When and why should I restart TomEE? How frequently?
2. In Unix/Linux itself, is there a way to see, for each node, the number of requests being received from the load balancer, so I can know whether the LB is distributing requests to each node evenly?
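For question 2, here is a sketch of one way to check this on each node. Live connection counts can come from `ss`, and request counts over time from the Tomcat/TomEE access log; the port, log location, and log lines below are assumptions, with sample data standing in for a real log:

```shell
# Live view (run on each node): established connections on the HTTP port.
# Uncomment to use; 8080 is an assumed connector port.
#   ss -tn state established '( sport = :8080 )' | wc -l

# Request counts over time come from the access log. These sample lines
# stand in for logs/localhost_access_log.*.txt on a real node.
printf '%s\n' \
  '10.0.0.5 - - [12/Mar/2019:10:01:02 +0000] "GET /app HTTP/1.1" 200 512' \
  '10.0.0.5 - - [12/Mar/2019:10:15:30 +0000] "GET /app HTTP/1.1" 200 512' \
  '10.0.0.5 - - [12/Mar/2019:11:02:11 +0000] "GET /app HTTP/1.1" 404 0' \
  > sample_access.log

# Requests per hour: take the bracketed timestamp, keep date+hour, count.
awk -F'[][]' '{split($2, t, ":"); print t[1] ":" t[2]}' sample_access.log \
  | sort | uniq -c
```

Comparing these per-hour counts across the four nodes shows whether the LB is spreading requests evenly.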

I am new to Unix and I have no experience with web server applications, so if you have any URL where I can read up on this, please share it.
I appreciate your help in advance.
# 2  
Any answer for the above questions?

I appreciate your help in advance.
# 3  
Be warned that bumping posts is against the rules on this forum, and moderators might sanction you for it.

Which load balancer (LB) are you using?

However, I would be inclined to compare the 4 configurations (conf/server.xml) to look for any differences. TomEE still uses standard Tomcat clustering, and that needs checking out.

I would be inclined to guess that the LB is not overloading node1; on the contrary, the LB is continuing to send requests to node1 after that node has failed (for some reason). There is a fault (hardware, software, or configuration) with node1.
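One way to confirm that, offered as a sketch only (the node hostnames, port 8080, and application path are placeholders), is to probe each node directly, bypassing the LB, and see which one is slow or not answering:

```shell
# Report each node's HTTP status, or "no response" if it doesn't answer
# within 10 seconds. Hostnames, port, and path are placeholders.
probe() {
  code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 10 "$1")
  if [ -z "$code" ] || [ "$code" = "000" ]; then
    echo "no response"
  else
    echo "HTTP $code"
  fi
}

for node in node1 node2 node3 node4; do
  printf '%s: %s\n' "$node" "$(probe "http://$node:8080/yourapp/")"
done
```

Run this a few times while the problem is happening; if node1 alone shows long delays or 404s while the other three answer normally, the fault is local to node1 rather than to the LB.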

The two-minute delay you see when loading the page is, in fact, the timeout before the request is successfully delivered to a surviving node. If everything is configured properly, it shouldn't take this long. Furthermore, once the failed node is kicked out of the cluster (until you issue the command for it to rejoin), the LB should know that node1 is dead and no longer send requests to it. You then run on three nodes after failover and all is fine and dandy.

So your problem is that failover is not occurring in an orderly fashion. I would confirm whether node1 is actually joining the cluster at all or you have a cluster of three plus a stand-alone node.
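One place to check is the TomEE/Tomcat log: the cluster component (`SimpleTcpCluster`) logs membership changes. A sketch follows, using sample lines because the exact log path (`logs/catalina.out` or dated log files) and message wording vary by version:

```shell
# Sample entries standing in for real catalina.out content; the wording
# here is an approximation of Tomcat's cluster membership messages.
cat > sample_catalina.log <<'EOF'
12-Mar-2019 10:00:01 INFO SimpleTcpCluster.memberAdded Replication member added:tcp://10.1.1.1:4000
12-Mar-2019 10:05:42 INFO SimpleTcpCluster.memberDisappeared Received member disappeared:tcp://10.1.1.1:4000
EOF

# Membership changes: a node joining ("member added") or leaving
# ("member disappeared") the cluster.
grep -Ei 'member (added|disappeared)' sample_catalina.log
```

If node1's log never shows the other members being added, it is running stand-alone rather than as part of the cluster.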

The way it should work is that the four cluster members should know immediately if one of them fails. The failed node gets kicked out. The surviving nodes relay this state to the LB which no longer sends requests to the failed node.

Again, which LB are you using?

Ref:
Apache TomEE
How to Setup Load Balancing in Tomcat/TomEE with mod_proxy_balancer – Simples Assim
Tomcat Clustering - A Step By Step Guide | MuleSoft
# 4  
Good afternoon. Sorry, I apologize for the bumped question; I will take that into account.

The data I've got so far is this: the LB is an F5; method used: least connections; persistence method: source address. I don't understand this terminology, but it was given by our networking administrator.

Thank you very much for your help and the feedback given.

--- Post updated at 12:17 PM ---

One more issue: the company that developed the application compared these 4 configurations and confirmed that they are identical, but they cannot find what is happening.

So I come up with this question:

By means of what command can I know whether a node has been kicked out of a cluster, or how it is rejoined?