Search Google with precision.
This little tool helps you with your Google searches. It uses Google search operators to search precisely for the terms you enter in the program, and it works with almost every search operator Google provides. You can search for files, sites/domains, in URLs, in titles, in text… To give the best results, some fields disable others, since some Google operators aren't meant to be mixed with others! It launches the search results in your default browser, so you can keep working with the same browser.
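The idea of combining plain terms with Google search operators can be sketched in a few lines. This is a minimal illustration, not the tool's actual source; the operator names (site:, filetype:, intitle:) are standard Google operators, and the helper names are my own.

```python
import urllib.parse
import webbrowser

def build_query(terms, site=None, filetype=None, intitle=None):
    """Combine plain terms with Google search operators into one query string."""
    parts = [terms]
    if site:
        parts.append(f"site:{site}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    if intitle:
        parts.append(f"intitle:{intitle}")
    return " ".join(parts)

query = build_query("annual report", site="example.com", filetype="pdf")
url = "https://www.google.com/search?q=" + urllib.parse.quote_plus(query)
# webbrowser.open(url)  # launch the results in the default browser
```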
A collection of my software utilities
Some handy mini tools for internet use
Some mini internet tools: webspider, to download a site's pages, images, and SWF files; minidownload, to download multiple files from a website.
This tool reads IPs or links to gather more links.
This tool reads IPs or links to gather links, which are stored in a MySQL database that it creates. Each input file creates a MySQL database with two tables: Ranges and Links.

Load Input:
1. If input files contain lines of the form 188.8.131.52/1, use Load IP/Mask or IPs.
2. If input files contain lines of the form 184.108.40.206, use Load IP/Mask or IPs.
3. If input files contain lines of the form 220.127.116.11-18.104.22.168, use Load Range. It is ok to have spaces before the -, so a line can be 22.214.171.124 - 126.96.36.199.

Load Link:
1. Lines are all links, which may begin with or without http:// or https://:
http://www.abcdefg012345.com
https://www.abcdefg012345.com
www.abcdefg012345.com

Bad Input: combinations of IPs, links, and ranges, e.g.:
188.8.131.52/1
www.abcdefg012345.com

Setting up the Data Source Name is the same as in version 1, which is described at https://sourceforge.net/p/jiprangescanner/wiki/Home/
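The three input formats above can be told apart mechanically. Here is a small sketch of such a classifier (my own code, not the tool's; it only mirrors the line shapes described above):

```python
import re

# Hypothetical line classifier matching the input formats described above.
IP = r"\d{1,3}(?:\.\d{1,3}){3}"

def classify(line):
    """Return which loader a line belongs to: 'ip', 'range', or 'link'."""
    line = line.strip()
    if re.fullmatch(IP + r"(?:/\d{1,2})?", line):
        return "ip"       # use Load IP/Mask or IPs
    if re.fullmatch(IP + r"\s*-\s*" + IP, line):
        return "range"    # use Load Range (spaces around - are fine)
    return "link"         # use Load Link

print(classify("10.0.0.0/8"))             # ip
print(classify("10.0.0.1 - 10.0.0.254"))  # range
print(classify("www.example.com"))        # link
```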
Group file sharing with advanced text-parsing capability for easy search
Originally created as a church resource-sharing system, phpShare&Search allows users to create accounts, share documents, search documents, and like or report documents. phpShare&Search's power comes from its advanced document parser, which extracts text from .PDF, .TXT, .DOC, and .DOCX files, and its community features for liking resources and reporting them as inappropriate or SPAM. Users can also subscribe to weekly updates of new content. Users may choose to download and host/install/configure/modify/manage this code themselves, or contract the code writer to do so for them. Contact me for a reasonable quote: eedrew <at> users <dot> sourceforge <dot> net. To support future revisions and/or contribute based on the value you found in this code, check out the External Link drop-down in the menu. Also, if you do not wish to create and maintain your own installation, email email@example.com for a quote on a turnkey solution.
LinuxNews is free software for Android 2.2 that lets you get information about the Linux world through RSS feeds, events, and videos from YouTube. You can save content locally through the menu items and read it even without an active Internet connection.
A crawler that can fetch content from websites.
The crawler can crawl many types of websites, including portals, digital newspapers, and Twitter-like sites, among others.
Referer spam (also known as log spam or referer bombing)
Requires:
- PHP CLI
- PHP cURL

Referer spam (also known as log spam or referer bombing) is a kind of spamdexing (spamming aimed at search engines). The technique involves making repeated web site requests using a fake referer URL that points to the site the spammer wishes to advertise. Sites that publicize their access logs, including referer statistics, will then inadvertently link back to the spammer's site. These links will be indexed by search engines as they crawl the access logs. This benefits the spammer because of the free link, which gives the spammer's site improved search engine ranking due to the link-counting algorithms that search engines use.
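The mechanism being described is simply an HTTP request carrying a forged Referer header. A minimal illustration (in Python rather than PHP, with placeholder URLs that are not part of the original tool):

```python
import urllib.request

# Minimal illustration of the mechanism described above: an HTTP request
# carrying a chosen Referer header, which the target site may record in
# its access log. Both URLs here are placeholders.
req = urllib.request.Request(
    "http://example.com/",
    headers={"Referer": "http://advertised-site.example/"},
)
# urllib.request.urlopen(req)  # the server would log the Referer with the hit
```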
A terminal-program for downloading torrents from PirateBay
Torrtux is a terminal-based program, written in Perl, for downloading torrents from The Pirate Bay. If you live in a country where TPB is blocked (UK, Finland, Belgium, etc.), you can set a proxy in the config file. With it you can get the magnet link of a torrent, copy it to the clipboard, and open your torrent manager, all from your terminal! It also shows the details of a torrent (author, date, type, size, etc.), just like being on the TPB site! Moreover, it retrieves subtitles from www.opensubtitles.org. It extracts information from the source code of the TPB page and parses it with regexps and the HTML-Parser library. In the config file ~/.torrtuxrc, you can choose your display, subtitle, and comment preferences, your torrent manager, and a proxy if needed! Thanks for reporting any bugs you find!
Added features from the community forums
This is an edited version 1.3.5 of Sphider, released under the GPLv3. Sphider 1.3.5 is required first: this package ONLY updates an original install of Sphider, which can be found here: http://www.sphider.eu/download.php What got me to work on this? http://www.sphider.eu/forum/read.php?3,6156 "thanks, inmotion" Sphider-CV allows you to index using ".sh" scripts, but also retains the default features. This version does not work on Windows servers because it needs to run .sh files; it would take some modifications to run .bat files instead. I will also upload a Windows version in the next few days.
Software that crawls websites and builds sitemaps.
Developed in Delphi X2 but fully compatible with Delphi 7, this tool scans a site given by the user and, through its "<a href='..." tags, starts a crawling process on it. By storing the information in an SQLite database, it makes it possible to build a map of the website.
PHP-based directory manager with a little something extra
Ameeba allows you to build categorized, searchable directories for your website, but with one major difference. Every category and subcategory of your online catalog, or photo gallery, or link directory, or whatever, can be split up into as many taxonomies as you need in order to properly organize your data. Not only that, but you can browse by all of your taxonomies simultaneously. To see an example of this concept in action have a look at the Science page of the Open Directory Project (http://www.dmoz.org/Science/). The links are categorized by scientific field, again by a collection of miscellaneous categories, and again by language. Click a link in one taxonomy and the contents of the other taxonomies update to reflect a more refined search. In this way you are browsing by all three at once. Ameeba lets you build these same kinds of directories for yourself instead of locking you into a single taxonomy like other directory managers.
Advanced search for MediaWiki
This extension for MediaWiki is based on Woogle, and over the last year we have improved and enhanced it. The main differences between Woogle and apMWsearch are the following:
* no remote server; search is restricted to the local wiki
* more file types can be searched
* different engines can be used for indexing files
* more flexible configuration for embedding into MediaWiki
Music web manager to play your MP3 playlist via HTTP
Based on the PHP Zend Framework, this is a manager for MP3 files. It reads your files from disk and allows you to play them via HTTP. The website exports a Winamp playlist for playback on the client computer.
A program for web search restricted to specific sites and periods of time, as defined by the user. It uses http://www.yandex.ru/ and http://www.google.ru. Search is achieved by redirecting the search query to those search services; in other words, Bolter is a wrapper around existing search services. Visit http://vk.com/bolter_app for more info.
Indexed cloud file system
icfs provides a way to map multiple URLs, referencing various web-enabled artifacts, into a single file-system presentation under Linux. URLs can reference static pages, REST calls, or web objects in an object store.
Starts web search upon any selected text in any app with a hotkey.
If you select some text in the browser and right-click it, there is an option in the context menu to "Search Google" for the selected phrase. Have you ever wondered why such an option isn't present in many other everyday apps? A solution arrives with Web Search Quickey! Now you can perform all your searches from nearly any application; all you need to do is have some (preferably meaningful) text selected and press the hotkey. The search results will be instantly presented in your default browser (in a new tab, if it happens to be open already). The latest version allows you to choose the search engine and define your preferred hotkey. You may need to add an exception to your anti-virus, firewall, etc. Please find more details in the readme file.
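The core idea is easy to sketch: turn the selected phrase into a search URL and open it in the default browser. This is my own minimal illustration, not the app's source; reading the actual text selection is platform-specific, so the phrase is passed in directly here.

```python
import urllib.parse
import webbrowser

# Hypothetical helper names; only the concept matches the app described above.
def build_search_url(text, engine="https://www.google.com/search?q="):
    """URL-encode the selected phrase and append it to the engine's query URL."""
    return engine + urllib.parse.quote_plus(text.strip())

def search_selection(text):
    """Open the search in the default browser, in a new tab if it's running."""
    webbrowser.open_new_tab(build_search_url(text))

# search_selection("web search quickey")
```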
Robots.txt parsing library
Robots.io is a Java library designed to make parsing a website's 'robots.txt' file easy. The RobotsParser class provides all the functionality of robots.io. Domains passed to RobotsParser are normalised to always end in a forward slash, and disallowed paths returned will never begin with one, so that URLs can easily be constructed.
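The normalisation convention described above can be shown in a few lines (sketched in Python rather than the library's Java API; the function names are my own): with domains ending in "/" and paths never starting with one, simple concatenation always yields a valid URL.

```python
# Illustration of the convention described above, not robots.io's API.
def normalise_domain(domain):
    """Domains always end in a forward slash."""
    return domain if domain.endswith("/") else domain + "/"

def normalise_path(path):
    """Disallowed paths never begin with a forward slash."""
    return path.lstrip("/")

domain = normalise_domain("https://example.com")
path = normalise_path("/private/data")
print(domain + path)  # https://example.com/private/data
```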
Pastebin Sniffer tool
Pastebin Sniffer can capture every public paste released on pastebin.com. The tool parses each paste against a wordlist so that only the pastes most interesting to the user are saved.
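A wordlist filter of that kind can be sketched as follows (my own illustration with made-up keywords, not the tool's actual code):

```python
# A paste is kept only if it contains at least one wordlist keyword.
# The keywords and sample pastes are hypothetical.
WORDLIST = {"password", "api_key", "secret"}

def is_interesting(paste_text):
    text = paste_text.lower()
    return any(word in text for word in WORDLIST)

pastes = ["lorem ipsum dolor", "db password: hunter2"]
kept = [p for p in pastes if is_interesting(p)]
print(kept)  # ['db password: hunter2']
```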
GUI frontend for the full-text search engine Namazu (www.namazu.org)
Neko is a GUI for Namazu, a full-text search engine (www.namazu.org).
Altabusca is a small search engine similar to Google.
Altabusca is a small search engine similar to Google. It can index sites, and while indexing it discovers new websites, which can also be indexed. Written entirely in PHP with a MySQL database; just install and run. Completely free and open source.
Portable Open Source Intelligence (OSINT) Web Browser
Oryon OSINT Browser is a web browser designed to assist researchers in conducting Open Source Intelligence (OSINT) investigations. Oryon comes with dozens of pre-installed tools and a selected set of links cataloged by category. To safely use all Oryon Browser options, it is recommended that you take the following steps:
1. Check your browser privacy settings; reset all settings if necessary or make the necessary changes.
2. Disable browsing history (extension: History On/Off).
3. Enable a free proxy (extension: Hotspot Shield Free VPN Proxy).
4. Create a dedicated Gmail account.
5. Create dedicated accounts on Facebook and Twitter.
6. Take a look at the collection of links.
7. Check out the Oryon Tools bookmarklet.
8. Check the installed add-ons (extensions: Context, Extension Manager).
9. Save a copy of the QueryTool spreadsheet on your Google Drive and check its performance with your cases.
10. Log in to the MIIS YouTube channel and watch the practical video tutorials.
Google Search, Google Site Search, Google News from the terminal
googler is a power tool to Google (Web & News) and Google Site Search from the command line. It shows the title, URL, and abstract for each result, which can be opened directly in a browser from the terminal. Results are fetched in pages (with page navigation), and sequential searches are supported in a single googler instance. googler was initially written to cater to headless servers without X, and you can integrate it with a text-based browser. However, it has grown into a very handy and flexible utility that delivers much more. For example, fetch any number of results or start anywhere, limit searches by any duration, define aliases to Google-search any number of websites, switch domains easily... all of this in a very clean interface without ads or stray URLs. The shell completion scripts make sure you don't need to remember any options. googler isn't affiliated with Google in any way. Demo: https://asciinema.org/a/85019
DuckDuckGo from the terminal
ddgr is a cmdline utility to search DuckDuckGo from the terminal. While googler is highly popular among cmdline users, the need for a similar utility for the privacy-aware DuckDuckGo came up in many forums. DuckDuckGo Bangs are super-cool too! So here's ddgr for you! Unlike the web interface, you can specify the number of search results you would like to see per page; it's more convenient than skimming through 30-odd results per page. The default interface is carefully designed to use minimum space without sacrificing readability. ddgr isn't affiliated with DuckDuckGo in any way. Demo: https://asciinema.org/a/151849