MakeWebGames

Everything posted by HaxXXxaH

  1. Need more info than that. You're better off posting the entire script.
  2. Exploding would mean splitting first on semicolons, then on commas, but that defeats the purpose: you can't search on a single userid that way without regex anyway. I don't know your limit on page execution time, but I prefer to keep all of mine under 0.2 seconds. When looking through logs with hundreds of thousands of entries, it tends to be slow via the database. Using filesystem + regex: total execution time in seconds: 0.045760869979858. I don't have it set up with the database to give an accurate timing, but it was well over 0.4 seconds. Some admins are okay with slow pages; I, however, prefer speed over everything. Yes, mccodes has a poorly designed database; I'm pretty sure the original coders have never even heard of an index.
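To illustrate the trade-off above, here is a minimal sketch (my own code, not from the post) of the two approaches against the `userid,page,timestamp;` log format used later in this thread. The sample log data is made up for the example.

```php
<?php
// Three fake entries, one per line, in the "userid,page,timestamp;" format.
$log = "1,/index.php,1467957322;\n21,/attack.php,1467957330;\n1,/gym.php,1467957340;\n";

// explode() approach: split on newlines, then on commas, then filter in PHP.
$matches_explode = [];
foreach (array_filter(explode("\n", $log)) as $line) {
    list($uid, $page, $time) = explode(',', rtrim($line, ';'));
    if ($uid === '1') {
        $matches_explode[] = [$uid, $page, $time];
    }
}

// Regex approach: one pass over the whole string, anchored at line start
// (/m) so userid 1 does not also match inside userid 21.
preg_match_all('/^(1),([^,]*),([0-9]+);/m', $log, $out, PREG_SET_ORDER);

// Both approaches find the same two entries for user 1.
```

With a large file the regex version avoids building and looping over a PHP array of every line, which is where the speed difference the post describes comes from.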
  3. Try going to the link... it's not an image. http://s1104.photobucket.com/user/Jordan_Pollard/media/YWN.png
  4. If you're worried about wasting database resources by logging every page load (assuming that's what you're trying to do; probably building an anti-cheat type of system, right?), I would suggest skipping the database entirely and sticking to text files. It's a lot faster than querying the database, especially when viewing the logs once there are enough of them. I used to log page loads via the database, switched, and won't be switching back. You can easily get a good 100k entries per day, which can be very taxing on the database.

I would also scrap the whole idea of a page function. Update the page in the same query where you update lastactive and last IP. There, saved you a query!

Advantage of text files: speed, way faster if you have a lot of entries. Disadvantage: makes it harder to query, but if you master regex you won't even notice.

Here's how to append to a file (which would replace logging in the database). [PATH] should be substituted with the path to your file:

```php
$file = '[PATH]/actlog.txt';
$add = "$userid,{$_SERVER['REQUEST_URI']}," . time() . ";\n";
file_put_contents($file, $add, FILE_APPEND | LOCK_EX);
```

Unfortunately you cannot prepend to a file in PHP, which makes things harder later on, but not impossible. (Note to the people who want to argue that you can prepend by loading the entire file into a variable, prepending to the variable, then saving the file: I am trying to use fewer resources, not load 100k lines into a variable!)

An entry would look like `1,/index.php,1467957322;` where 1 is the user's id, /index.php is the page loaded, and 1467957322 is the time they loaded it.

Rough idea of the staff panel code. By rough, I mean you'll have to add your own pagination or timestamp constraints if you want them, but you can show everyone (don't set $_GET['userid']) or narrow it down to one user by setting $_GET['userid']:
```php
function view_act_hour_logs()
{
    print "<table style='width:100%;'>
        <tr><th>Username</th><th>Page</th><th>Time</th><th>Last</th></tr>";
    $file = '[PATH]/actlog.txt';
    $current = file_get_contents($file);
    // Cast to int so a crafted userid can't inject regex syntax.
    $usid = isset($_GET['userid']) ? (int)$_GET['userid'] : '[0-9]+';
    // Anchor at line start (/m) so userid 1 doesn't also match userid 21.
    preg_match_all("/^($usid),([^,]*),([0-9]+);/m", $current, $out, PREG_PATTERN_ORDER);
    $out[1] = array_reverse($out[1]); // reversing because we couldn't prepend earlier! these are the IDs
    $out[2] = array_reverse($out[2]); // these are the pages
    $out[3] = array_reverse($out[3]); // these are the timestamps
    for ($i = 0; $i < count($out[1]); $i++) {
        // Time since the previous (more recent) load request; helps in
        // tracking cheaters who use macros at a set interval.
        $since = isset($last) ? $last - $out[3][$i] : 0;
        $last = $out[3][$i];
        print "<tr><td>" . usernameGen($out[1][$i]) . "</td>"
            . "<td>{$out[2][$i]}</td>"
            . "<td>" . date('F j Y g:i:s a', $out[3][$i]) . "</td>"
            . "<td>$since</td></tr>";
    }
    print "</table>";
}
```

You can further improve on these ideas, but I'm not going to give you all my work :P. For instance, on the hourly cron you can copy the file to a different file for backlogs, then clear the current file, to keep it from getting too big; that requires a bunch of edits to the staff panel, though. But there's how you can do what you want without using any more database resources than you were already using ;). Confusing as hell? I bet.
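The hourly rotation idea mentioned above can be sketched like this. This is my own minimal version, not the poster's code: copy the live log to a backlog file, then truncate the live file, holding the same lock the writers use so no entry is lost mid-swap.

```php
<?php
// Rotate the activity log: archive the current entries to $backlog,
// then empty the live file. Returns false if the live file can't be opened.
function rotate_act_log($file, $backlog)
{
    $fp = fopen($file, 'r+');
    if ($fp === false) {
        return false;
    }
    flock($fp, LOCK_EX);     // block the FILE_APPEND | LOCK_EX writers during the swap
    copy($file, $backlog);   // keep the old entries around for the staff panel backlogs
    ftruncate($fp, 0);       // empty the live file
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}
```

An hourly cron could then call something like `rotate_act_log('[PATH]/actlog.txt', '[PATH]/actlog.' . date('Y-m-d-H') . '.txt');` — the backlog naming scheme is an assumption; pick whatever your staff panel expects.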