
Feature Request

SimonP
Created: 2007-10-17
Updated: 2013-04-24
  • SimonP

    SimonP - 2007-10-17

    Hi,

    I've just installed the software and I think it is, so far, the best application I have found for the job.

    There is just one missing feature that would make this perfect - the ability to export the reports as a CSV file (so I can wave it at my manager).

    I need to be able to pull off a report that shows the top 5 (for example) bandwidth hogs, with the sites visited and the bandwidth used. This would be very easy to do with a CSV.

    Any help you could give would be much appreciated. I'm happy to manually run an SQL script on the DB and export the results as CSV, but as far as writing the query goes I'm a bit stuck (see the CSV export sketch after this post).

    Thanks

    Si
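
    A minimal sketch of the CSV export Si describes, assuming MySQL's SELECT ... INTO OUTFILE is available (it requires the FILE privilege and writes the file on the database server). The table and column names are taken from the queries Ioannis posts later in this thread; the output path and the top-5 grouping are illustrative assumptions, not a MySAR feature.

    -- Hypothetical example: dump yesterday's top 5 users by bytes to a CSV file.
    -- The output path is an assumption; point it at a directory MySQL can write to.
    SELECT users.authuser AS username,
           SUM(trafficSummaries.inCache + trafficSummaries.outCache) AS bytes
    INTO OUTFILE '/tmp/top5_bandwidth.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    FROM trafficSummaries
    LEFT JOIN users ON trafficSummaries.usersID = users.id
                   AND trafficSummaries.date = users.date
    WHERE trafficSummaries.date = DATE_ADD(CURDATE(), INTERVAL -1 DAY)
    GROUP BY users.authuser
    ORDER BY bytes DESC
    LIMIT 5;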

     
    • Ioannis Stoilis

      Ioannis Stoilis - 2007-10-18

      Indeed, useful feature. Noted.

      As for the SQL, tell me examply what the output should look like in terms of field names and exactly what you want fetched, and I will write an SQL query for you.

       
      • Ioannis Stoilis

        Ioannis Stoilis - 2007-10-18

        Sorry, instead of examply I meant exactly.

         
    • SimonP

      SimonP - 2007-10-18

      Hi,

      Thanks for the quick reply.

      The reports I would find most useful would be...

      Highest 10 bandwidth users, with fields 'username', 'host', 'bytes', 'number of sites'

      Most visited sites, with fields 'site', 'bytes', 'number of visits'

      URL listing per user (as in select ...... where user = '...'), with fields 'time', 'site', 'bytes' (Could this be a list of URLs, as shown in the "details" section of the interface, instead of just the root domain?)

      Thanks for your help.

      Si

       
      • Ioannis Stoilis

        Ioannis Stoilis - 2007-10-24

        Sorry, I forgot about this post...!

        For what time period do you want the SQL queries?

         
      • Ioannis Stoilis

        Ioannis Stoilis - 2007-10-24

        > Highest 10 bandwidth users, with fields 'username', 'host', 'bytes', 'number of sites'

        SELECT DATE_ADD(CURDATE(), INTERVAL -1 DAY) AS mydate,
               INET_NTOA(trafficSummaries.ip) AS hostip,
               hostnames.hostname AS hostname,
               users.authuser AS username,
               SUM(trafficSummaries.inCache+trafficSummaries.outCache) AS bytes,
               TRUNCATE((SUM(trafficSummaries.inCache)/SUM(trafficSummaries.inCache+trafficSummaries.outCache))*100,0) AS cachePercent,
               COUNT(DISTINCTROW(trafficSummaries.sitesID)) AS sites
        FROM trafficSummaries
        LEFT JOIN hostnames ON trafficSummaries.ip=hostnames.ip
        LEFT JOIN users ON trafficSummaries.usersID=users.id AND trafficSummaries.date=users.date
        WHERE trafficSummaries.date=DATE_ADD(CURDATE(), INTERVAL -1 DAY)
        GROUP BY trafficSummaries.ip,trafficSummaries.usersID
        ORDER BY bytes DESC
        LIMIT 10;

        > Most visited sites, with fields 'site', 'bytes', 'number of visits'

        I am not sure what you mean by "visits", so I just count the individual users:
        SELECT DATE_ADD(CURDATE(), INTERVAL -1 DAY) AS mydate,
               sites.site AS site,
               SUM(trafficSummaries.inCache+trafficSummaries.outCache) AS bytes,
               COUNT(trafficSummaries.usersID)
        FROM trafficSummaries
        JOIN sites ON trafficSummaries.date=sites.date AND trafficSummaries.sitesID=sites.id
        WHERE trafficSummaries.date=DATE_ADD(CURDATE(), INTERVAL -1 DAY)
        GROUP BY trafficSummaries.sitesID
        ORDER BY bytes DESC
        LIMIT 10;

        > URL listing per user (as in select ...... where user = '...'), with fields 'time', 'site', 'bytes' (Could this be
        > a list of URLs as shown in the "details" section of the interface instead of just root domain)

        Be careful, this may take some time:
        SELECT DATE_ADD(CURDATE(), INTERVAL -1 DAY) AS mydate,
               traffic.time AS time,
               traffic.bytes AS bytes,
               traffic.url AS url,
               traffic.resultCode AS resultCode
        FROM traffic
        JOIN users ON usersID=users.id
        WHERE traffic.date=DATE_ADD(CURDATE(), INTERVAL -1 DAY) AND users.authuser='-'
        ORDER BY traffic.time DESC
        LIMIT 10;

        All queries are predefined to report on the previous day. I guess you know enough to customize them further yourself, right?
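
        One hedged example of the customization mentioned above: assuming trafficSummaries.date holds one set of rows per calendar day (as the per-day joins above suggest), the single-day filter can be widened to a date range. Because the sites table appears to be keyed per day, the grouping switches from sitesID to the site name when spanning several days; the 7-day window is an illustrative assumption.

        -- Hypothetical variant: the "most visited sites" query over the last 7 days
        -- instead of just yesterday. Grouping by sites.site rather than sitesID is
        -- assumed necessary because sites rows appear to be stored per day.
        SELECT sites.site AS site,
               SUM(trafficSummaries.inCache + trafficSummaries.outCache) AS bytes,
               COUNT(trafficSummaries.usersID) AS requests
        FROM trafficSummaries
        JOIN sites ON trafficSummaries.date = sites.date
                  AND trafficSummaries.sitesID = sites.id
        WHERE trafficSummaries.date BETWEEN DATE_ADD(CURDATE(), INTERVAL -7 DAY)
                                        AND DATE_ADD(CURDATE(), INTERVAL -1 DAY)
        GROUP BY sites.site
        ORDER BY bytes DESC
        LIMIT 10;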

         
    • SimonP

      SimonP - 2007-10-24

      No problem...

      Just daily please.

       
    • SimonP

      SimonP - 2007-10-24

      That's fantastic.

      Thanks

       
    • eldersouza

      eldersouza - 2008-11-03

      Hi Giannis,

      First: sorry for my English, ok!

      Second: Paging results
      What do you think about paging all the reports? I have a site in my company with so much internet access that I can't show the IpSitesSummary case, because my squid server doesn't support it... I don't have enough memory to put a million records into a vector to show later!! Mysteriously, the squid process crashes! I don't know exactly why; I think the vector used all the computer's memory. With small results I can show the page and Squid doesn't crash.

      You use a vector (pageVars) to hold all the data and then show the result by calling the Smarty class... Why don't you page the results in that case, instead of putting everything in a vector and showing it later?

      Can you add an option to select between 10, 20, 50, 100, 200 and ALL results per page?
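
      A minimal sketch of what that per-page option could look like at the SQL level, under the assumption that a report fetches one page at a time instead of loading every row into pageVars; the page size and page number here are illustrative values, not MySAR parameters.

      -- Hypothetical paging sketch: fetch the third page of 50 rows.
      -- OFFSET = (page_number - 1) * page_size, so page 3 at 50 rows/page skips 100 rows.
      SELECT INET_NTOA(trafficSummaries.ip) AS hostip,
             SUM(trafficSummaries.inCache + trafficSummaries.outCache) AS bytes
      FROM trafficSummaries
      WHERE trafficSummaries.date = DATE_ADD(CURDATE(), INTERVAL -1 DAY)
      GROUP BY trafficSummaries.ip
      ORDER BY bytes DESC
      LIMIT 50 OFFSET 100;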

      Third: Can you add a search option on every page to make the results more specific? Like "show all clients that accessed sites containing the word porn"... Then I type the word PORN in a field, click submit, and MySAR returns: clients A, B and C accessed http://pornnow.ru, http://allpornsex.cz... Do you understand?
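
      For the search idea, a hedged sketch of the kind of query that could back it, reusing the table names from the queries earlier in the thread; the LIKE pattern, the single-day filter, and the join back to users are illustrative assumptions.

      -- Hypothetical search sketch: which users visited sites whose name contains 'porn'
      -- yesterday. The '%porn%' pattern would come from the submitted search field.
      SELECT users.authuser AS username,
             sites.site AS site,
             SUM(trafficSummaries.inCache + trafficSummaries.outCache) AS bytes
      FROM trafficSummaries
      JOIN sites ON trafficSummaries.date = sites.date
                AND trafficSummaries.sitesID = sites.id
      LEFT JOIN users ON trafficSummaries.usersID = users.id
                     AND trafficSummaries.date = users.date
      WHERE trafficSummaries.date = DATE_ADD(CURDATE(), INTERVAL -1 DAY)
        AND sites.site LIKE '%porn%'
      GROUP BY users.authuser, sites.site
      ORDER BY bytes DESC;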

      Thanx for all!

       
