This project focuses on Supported Right, a tool designed to streamline tasks for support engineers at a web development and design agency.
Manually managing over 100 sites was causing inefficiency.
Implemented an error detection algorithm to identify website issues instantly.
Support tickets are now completed 95% faster.
During my time at Rooah! LLC, I noticed the WordPress management team struggling to manually check over a hundred sites daily, which led to inefficiency and wasted time.
To address this, I created a website checker that scrapes each site and flags issues. The interface is user-friendly and provides clear troubleshooting directions.
To keep the dashboard private, I implemented a simple passcode gate in JavaScript: the expected keyword is fetched from Firestore, and the status table is only rendered once the correct passcode is entered.
// Cache the dashboard markup, then remove it from the page until the user authenticates.
var cacheTable = '';
var cacheForm = $("#form-con");
$("#form-con").remove();

// Fetch the expected passcode from the Firestore "keys" collection.
var keyword = "";
var keyRef = db.collection("keys").doc("password");
keyRef.get().then((pass) => {
    if (pass.exists) {
        keyword = pass.data()["pass"];
    } else {
        console.log("No such document for keyword!");
    }
}).catch((error) => {
    console.log("Error getting keyword from db:", error);
});

// Dim the page and show the passcode prompt (center() is a small custom jQuery helper).
$('<div>', {id : 'overlay'}).appendTo('body');
$("#auth").fadeIn('slow').center();

// Compare the entered passcode against the stored keyword before revealing the table.
function handleAuth() {
    var key = document.getElementById("keyword").value;
    if (key !== keyword) {
        alert("Wrong keyword!");
    } else {
        $("#overlay").remove();
        $("#auth").remove();
        updateTable();
    }
}
Recognizing that a site can respond successfully while its content fails to load, I used Beautiful Soup to parse each homepage and count its HTML tags, so the diagnosis reflects what actually rendered rather than just the HTTP status code.
import datetime

import requests
from bs4 import BeautifulSoup

# db (Firestore client), websites, upWebsites, and sosWebsites are defined earlier in the script.
for w in websites:
    # Timestamp the check (local time shifted by one hour and labelled GMT).
    currentTime = str(datetime.datetime.now() + datetime.timedelta(hours=1)).split('.')[0]
    currentTimeGMT = currentTime + ' GMT'
    try:
        # Fetch the homepage and parse it so the tags that actually rendered can be counted.
        response = requests.get(f'https://{w}')
        soup = BeautifulSoup(response.content, 'html.parser')
        pageTags = soup.find_all()
    except Exception:
        # The request itself failed, most likely a DNS or domain problem.
        sosWebsites.append(f'https://{w}')
        db.collection(u'status').document(w).set({
            u'checkedAt' : currentTimeGMT,
            u'comment' : 'Probably A Domain Issue',
            u'status' : 'Down',
            u'website' : w,
        }, merge = True)
    else:
        if response.status_code == 200 and len(pageTags) > 50:
            # The page responded and has enough markup to count as fully loaded.
            upWebsites.append(f'https://{w}')
            db.collection(u'status').document(w).set({
                u'checkedAt' : currentTimeGMT,
                u'comment' : 'Working',
                u'status' : 'Up',
                u'website' : w,
            }, merge = True)
        else:
            # The site answered, but with an error code or a near-empty page.
            sosWebsites.append(f'https://{w}')
            db.collection(u'status').document(w).set({
                u'checkedAt' : currentTimeGMT,
                u'comment' : f'Website Error: {response.status_code}',
                u'status' : 'Down',
                u'website' : w,
            }, merge = True)
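The up/down heuristic itself, an HTTP 200 plus a minimum number of parsed tags, is easy to pull out into its own helper so it can be tested in isolation. Below is a minimal sketch using the same 50-tag threshold; the function name check_site, the min_tags parameter, and the request timeout are illustrative and not part of the original script.

import requests
from bs4 import BeautifulSoup

def check_site(domain, min_tags=50, timeout=10):
    """Return a (status, comment) pair for one domain using the same heuristic as the checker."""
    try:
        response = requests.get(f'https://{domain}', timeout=timeout)
        tag_count = len(BeautifulSoup(response.content, 'html.parser').find_all())
    except Exception:
        # Request failed outright, e.g. DNS or connection error.
        return 'Down', 'Probably A Domain Issue'
    if response.status_code == 200 and tag_count > min_tags:
        return 'Up', 'Working'
    return 'Down', f'Website Error: {response.status_code}'

print(check_site('example.com'))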
I also optimized the app for mobile, turning desktop table rows into cards for easier use on small screens.
Features such as adding and deleting websites, plus a direct link to each site for quick access, improved overall efficiency.
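Adding or removing a monitored site ultimately comes down to writing or deleting a Firestore document. Here is a minimal sketch of how that could look on the Python side, assuming a hypothetical 'websites' collection keyed by domain; the collection name and helper functions are illustrative, since the original app manages this through the web interface.

from google.cloud import firestore

db = firestore.Client()

def add_website(domain):
    # Create (or overwrite) one document per monitored domain.
    db.collection('websites').document(domain).set({'website': domain})

def delete_website(domain):
    # Remove the domain and its cached status entry.
    db.collection('websites').document(domain).delete()
    db.collection('status').document(domain).delete()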