Forum Topic
Squid+Debian Youtube caching
-
Squid storage de-duplication
This post is a way to share my experience patching, building, and installing squid from source in order to cache youtube videos. references can be found at
http://coderstalk.blogspot.com/2011/08/caching-youtube-using-squid-caching.html
and
http://eu.squid-cache.org/ConfigExamples/DynamicContent/YouTube
let's get started, shall we...
1. Install a Debian system, update it, then install the dhcp server and bind9.
apt-get update
apt-get install isc-dhcp-server bind9
we'll just use bind9 as a dns cache for now.
next we'll configure debian as a dhcp server. we will assume eth0 is your internet interface and eth1 is our LAN interface.
configure interfaces
nano /etc/network/interfaces
allow-hotplug eth0 eth1
iface eth0 inet dhcp
iface eth1 inet static
address 192.168.1.1
netmask 255.255.255.0
network 192.168.1.0
broadcast 192.168.1.255
save, exit and reconfigure the network
ifdown eth0
ifup eth0
ifup eth1
configure which interface the dhcp server will listen on.
nano /etc/default/isc-dhcp-server
change the following line from
INTERFACES=""
to
INTERFACES="eth1"
configure the info the dhcp server will hand out to clients
nano /etc/dhcp/dhcpd.conf
comment out the following lines
option domain-name
option domain-name-servers
default-lease-time
max-lease-time
uncomment authoritative
uncomment and edit the "A slightly different configuration" block, including the last curly brace. it should look like this:
subnet 192.168.1.0 netmask 255.255.255.0 {
range 192.168.1.2 192.168.1.200;
option domain-name-servers 192.168.1.1;
option routers 192.168.1.1;
option broadcast-address 192.168.1.255;
default-lease-time 600;
max-lease-time 7200;
}
save, close and start dhcp server
/etc/init.d/isc-dhcp-server start
2. Download the squid source (use the 2.7.STABLE9 version)
cd /usr/src
wget http://www.squid-cache.org/Versions/v2/2.7/squid-2.7.STABLE9.tar.gz
3. extract squid source
tar -xvzf squid*.tar.gz
Now that we have the squid source, let's start installing the utilities we need to build it. oh, btw, we will also enable ssl just in case we need it in the future.
1. install needed utilities.
apt-get install binutils sharutils ccze make automake gawk libssl-dev ssh openssh-server
2. we already have a dhcp server, so let's make life easier by working from a windows client pc using putty. download putty on your windows client machine. don't know where? google it.
3. On your windows client machine open putty, enter the server address (in our case 192.168.1.1, port 22), and log in as root. now we can copy and paste commands. yay! (or you can install webmin for a gui-based interface). i like using ssh though, just because it's fast, clean and simple to use.
4. let's enable IP forwarding.
nano /proc/sys/net/ipv4/ip_forward
change the 0 to 1. save and exit.
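note that the nano edit above only lasts until reboot (the firewall script later in this guide sets it again at startup anyway). as a hedged alternative, the usual debian way to make it permanent is the sysctl.conf knob:

```
# /etc/sysctl.conf -- uncomment or add this line:
net.ipv4.ip_forward = 1
```

then apply it without rebooting by running sysctl -p.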
Now we're ready to configure and compile squid.
1. go to squid source directory
cd /usr/src/squid*
2. configure squid. if you're using putty you can copy, then right-click on your putty window to paste commands. (gotta love putty for that) :D
./configure --prefix=/usr --exec_prefix=/usr --bindir=/usr/sbin --sbindir=/usr/sbin --libexecdir=/usr/lib/squid --sysconfdir=/etc/squid --localstatedir=/var/spool/squid --datadir=/usr/share/squid --enable-async-io --with-pthreads --enable-ssl --enable-storeio=ufs,aufs,coss,diskd,null --enable-linux-netfilter --enable-arp-acl --enable-epoll --enable-removal-policies=lru,heap --enable-snmp --enable-delay-pools --enable-htcp --enable-cache-digests --enable-referer-log --enable-useragent-log --enable-auth=basic,digest,ntlm,negotiate --enable-http-violations --enable-carp --enable-follow-x-forwarded-for --with-large-files --with-maxfd=65536
3. compile and install
make && make install
Whew! are we ready to run squid -z yet? sadly, not yet. we need to configure squid first and set file/folder permissions.
1. create logs and cache directory.
cd /var/spool/squid
mkdir cache
mkdir logs
chown proxy:proxy *
2. edit the squid.conf file. i do an mv, then create an empty squid.conf file and paste in a compact squid.conf i made in notepad or scite.
cd /etc/squid
mv squid.conf squid.conf.default
nano squid.conf
paste this:
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.1.0/24 # RFC1918 possible internal network
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localnet
http_access deny all
icp_access allow localnet
icp_access deny all
http_port 3128 transparent
hierarchy_stoplist cgi-bin ?
cache_mem 128 MB
maximum_object_size_in_memory 8192 KB
cache_dir ufs /var/spool/squid/cache 2048 32 256
minimum_object_size 0 KB
maximum_object_size 1024 MB
access_log /var/spool/squid/logs/access.log squid
cache_log /var/spool/squid/logs/cache.log
cache_store_log /var/spool/squid/logs/store.log
storeurl_rewrite_children 1
storeurl_rewrite_concurrency 10
refresh_pattern (get_video|videoplayback|videodownload|\.flv).*(begin|start)\=[1-9][0-9]* 0 0% 0
refresh_pattern -i \.flv$ 9999999 999999% 9999999 ignore-no-cache override-expire ignore-private
refresh_pattern ^ftp: 1440 20% 10080 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://[A-Za-z0-9]+\.lscache[0-9]\.c\.youtube\.com 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://[a-z0-9]+\.youtube\.com 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://[a-z]+\.youtube\.com 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://[a-z0-9]+\.ytimg\.com 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://*\.youtube\.com 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern get_video\?video_id 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern youtube\.com/get_video\? 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://*.youtube.com/.* 9999999 999999% 999999999 ignore-no-cache override-expire ignore-private
refresh_pattern (get_video\?|videoplayback\?|videodownload\?|\.flv\?|\.fid\?) 999999 999999% 999999 override-expire ignore-reload ignore-private negative-ttl=0
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 40% 4320
acl store_rewrite_list url_regex -i \.youtube\.com\/get_video\?
acl store_rewrite_list url_regex -i \.youtube\.com\/videoplayback \.youtube\.com\/videoplay \.youtube\.com\/get_video\?
acl store_rewrite_list url_regex -i \.youtube\.[a-z][a-z]\/videoplayback \.youtube\.[a-z][a-z]\/videoplay \.youtube\.[a-z][a-z]\/get_video\?
acl store_rewrite_list url_regex -i \.googlevideo\.com\/videoplayback \.googlevideo\.com\/videoplay \.googlevideo\.com\/get_video\?
acl store_rewrite_list url_regex -i \.google\.com\/videoplayback \.google\.com\/videoplay \.google\.com\/get_video\?
acl store_rewrite_list url_regex -i \.google\.[a-z][a-z]\/videoplayback \.google\.[a-z][a-z]\/videoplay \.google\.[a-z][a-z]\/get_video\?
acl store_rewrite_list url_regex -i (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\/videoplayback\?
acl store_rewrite_list url_regex -i (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\/videoplay\?
acl store_rewrite_list url_regex -i (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\/get_video\?
acl store_rewrite_list url_regex -i http://video\..*fbcdn\.net.*\.mp4.*
acl store_rewrite_list url_regex -i http://.[0-9]\.[0-9][0-9]\.channel\.facebook\.com/.*
acl store_rewrite_list url_regex -i http://.*\.mp4?
acl store_rewrite_list url_regex -i http://www\.facebook\.com/ajax/flash/.*
acl store_rewrite_list url_regex -i http://.*\.ak\.fbcdn\.net/.*
acl store_rewrite_list url_regex -i \.geo.yahoo\.com\?
storeurl_access allow store_rewrite_list
storeurl_access deny all
storeurl_rewrite_program /etc/squid/storeurl.pl
quick_abort_min -1 KB
acl shoutcast rep_header X-HTTP09-First-Line ^ICY.[0-9]
upgrade_http0.9 deny shoutcast
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
cache_mgr [email protected]
cache_effective_user proxy
cache_effective_group proxy
snmp_port 3401
acl aclname snmp_community string
acl snmppublic snmp_community public
snmp_access allow snmppublic all
snmp_outgoing_address 0.0.0.0
coredump_dir /var/spool/squid/logs
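one line in the squid.conf above that is easy to misread is cache_dir, so here is a comment-only annotation (this is the standard ufs cache_dir syntax, nothing extra to paste):

```
# cache_dir ufs <path> <size in MB> <L1 dirs> <L2 dirs>
# i.e. 2048 MB of disk cache in /var/spool/squid/cache, spread over
# 32 first-level and 256 second-level subdirectories:
cache_dir ufs /var/spool/squid/cache 2048 32 256
```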
3. create a storeurl.pl
nano /etc/squid/storeurl.pl
then paste this:
#!/usr/bin/perl
$|=1;
while (<>) {
@X = split;
$url = $X[0];
if ($url=~s@^http://(.*?)/videoplayback\?(.*)id=(.*?)&.*@squid://videos.youtube.INTERNAL/ID=$3@){}
elsif ($url=~s@^http://(.*?)/videoplayback\?(.*)id=(.*?)@squid://videos.youtube.INTERNAL/ID=$3@){}
elsif ($url=~s@^http://(.*?)/videoplay\?(.*)id=(.*?)&.*@squid://videos.youtube.INTERNAL/ID=$3@){}
elsif ($url=~s@^http://(.*?)/videoplay\?(.*)id=(.*?)@squid://videos.youtube.INTERNAL/ID=$3@){}
elsif ($url=~s@^http://(.*?)/get_video\?(.*)video_id=(.*?)&.*@squid://videos.youtube.INTERNAL/ID=$3@){}
elsif ($url=~s@^http://(.*?)/get_video\?(.*)video_id=(.*?)@squid://videos.youtube.INTERNAL/ID=$3@){}
elsif ($url=~s@^http://(.*?)rapidshare(.*?)/files/(.*?)/(.*?)/(.*?)@squid://files.rapidshare.INTERNAL/$5@){}
elsif ($url=~s@^http://(.*?)fbcdn\.net/(.*?)/(.*?)/(.*?\.jpg)@squid://files.facebook.INTERNAL/$4@){}
elsif ($url=~s@^http://contenidos2(.*?)/(.*?)@squid://files.contenidos2.INTERNAL/$2@){}
elsif ($url=~s@^http://cdn(.*?)/([0-9a-zA-Z_-]*?\.flv)@squid://files.cdn.INTERNAL/$2@){}
elsif ($url=~s@^http://web.vxv.com/data/media/(.*?)@squid://files.vxv.INTERNAL/$1@){}
elsif ($url=~s@^http://(.*?)megaupload\.com/files/(.*?)/(.*?)@squid://files.megaupload.INTERNAL/$3@){}
elsif ($url=~s@^http://(.*?)mediafire\.com/(.*?)/(.*?)@squid://files.mediafire.INTERNAL/$3@){}
elsif ($url=~s@^http://(.*?)depositfiles\.com/(.*?)/(.*?)/(.*?)@squid://files.depositfiles.INTERNAL/$4@){}
print "$url\n";
}
4. now let's make storeurl.pl executable.
chmod 755 /etc/squid/storeurl.pl
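before wiring it into squid, you can smoke-test the rewriter logic by hand. this is just a sketch that replays the get_video rule from storeurl.pl as a perl one-liner (the URLs are made-up examples): two different mirror hostnames should collapse to the same internal key, which is exactly the de-duplication we're after.

```shell
# feed two fake mirror URLs for the same video id through the rule
echo 'http://v1.cache1.c.youtube.com/get_video?video_id=abc123&ext=.flv' |
  perl -pe 's@^http://(.*?)/get_video\?(.*)video_id=(.*?)&.*@squid://videos.youtube.INTERNAL/ID=$3@'
echo 'http://v7.cache2.c.youtube.com/get_video?video_id=abc123&ext=.flv' |
  perl -pe 's@^http://(.*?)/get_video\?(.*)video_id=(.*?)&.*@squid://videos.youtube.INTERNAL/ID=$3@'
# both print the same key: squid://videos.youtube.INTERNAL/ID=abc123
```

if both lines print the same squid:// key, the rule is doing its job and squid will store the two URLs as one object.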
5. create a transparent firewall script to run on startup
nano /etc/init.d/transparent
paste this:
#!/bin/bash
# squid server IP
SQUID_SERVER="192.168.1.1"
# Interface connected to Internet
INTERNET="eth0"
# Interface connected to LAN
LAN_IN="eth1"
# Squid port
SQUID_PORT="3128"
# DO NOT MODIFY BELOW
# Clean old firewall
iptables -F
iptables -X
iptables -t nat -F
iptables -t nat -X
iptables -t mangle -F
iptables -t mangle -X
# Load IPTABLES modules for NAT and IP conntrack support
modprobe ip_conntrack
modprobe ip_conntrack_ftp
# For win xp ftp client
#modprobe ip_nat_ftp
echo 1 > /proc/sys/net/ipv4/ip_forward
# Setting default filter policy
iptables -P INPUT DROP
iptables -P OUTPUT ACCEPT
# Unlimited access to loop back
iptables -A INPUT -i lo -j ACCEPT
iptables -A OUTPUT -o lo -j ACCEPT
# Allow UDP, DNS and Passive FTP
iptables -A INPUT -i $INTERNET -m state --state ESTABLISHED,RELATED -j ACCEPT
# set this system as a router for Rest of LAN
iptables --table nat --append POSTROUTING --out-interface $INTERNET -j MASQUERADE
iptables --append FORWARD --in-interface $LAN_IN -j ACCEPT
# unlimited access to LAN
iptables -A INPUT -i $LAN_IN -j ACCEPT
iptables -A OUTPUT -o $LAN_IN -j ACCEPT
# DNAT port 80 requests coming from LAN systems to squid 3128 ($SQUID_PORT) aka transparent proxy
iptables -t nat -A PREROUTING -i $LAN_IN -p tcp --dport 80 -j DNAT --to $SQUID_SERVER:$SQUID_PORT
# if it is same system
iptables -t nat -A PREROUTING -i $INTERNET -p tcp --dport 80 -j REDIRECT --to-port $SQUID_PORT
# DROP everything and Log it
iptables -A INPUT -j LOG
iptables -A INPUT -j DROP
6. save and exit. now we will make it executable and include it at startup
chmod 755 /etc/init.d/transparent
update-rc.d transparent defaults
7. create a startup script for squid as well.
nano /etc/init.d/squid
paste this:
#!/bin/bash
squid
8. save and close. then make it executable. then add it to startup.
chmod 755 /etc/init.d/squid
update-rc.d squid defaults
9. create squid cache directory
squid -z
10. run squid in debug mode. you can press ctrl+c to close squid in debug mode. at this point you can start watching a video and see if it caches
squid -NCd1
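what you're looking for, in the debug output or in access.log afterwards, is the cache status field flipping from TCP_MISS on the first view to TCP_HIT on the second. the log lines below are illustrative samples i made up for the grep, not real output:

```shell
# two made-up access.log lines: a first view (MISS) and a replay (HIT)
cat > /tmp/access.sample <<'EOF'
1313740000.123    150 192.168.1.10 TCP_MISS/200 104857 GET http://v5.lscache3.c.youtube.com/videoplayback?id=abc - DIRECT/74.125.1.1 video/x-flv
1313740050.456      5 192.168.1.11 TCP_HIT/200 104857 GET http://v5.lscache3.c.youtube.com/videoplayback?id=abc - NONE/- video/x-flv
EOF
grep -c 'TCP_HIT' /tmp/access.sample   # counts 1 matching line
```

on the live box the equivalent check is tail -f /var/spool/squid/logs/access.log | grep TCP_HIT while you replay a video.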
11. start squid
squid
or you can
/etc/init.d/squid
12. closing squid
squid -k kill
or
squid -k shutdown
13. checking iptables
iptables -L
Some notes: watching a video in IE and then watching it again in firefox sometimes causes problems; you have to force a reload or refresh.
A video cached using firefox will not load when using IE.
the compatible browsers i've tested with cached video are those based on the firefox and webkit sources. so yes: Firefox, Safari, Chrome, Opera, K-Meleon, Flock and many others, except IE. what i did was get rid of IE and all IE shortcuts from our school PCs. :D
Why do this?
1. I did this to learn and test stuff from blogs. I am well aware of pfsense and lusca-cache.
2. I found pfsense and lusca-cache to be sluggish at times and wanted video caching that works with the old setup (Ubuntu/Debian + Squid) i made, which for me served content noticeably faster compared to pf+squid. but then, that's just me... :)
3. The refresh patterns are mainly for youtube and other specific sites, you can add your own refresh patterns too.
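for reference when adding your own: the field order is refresh_pattern [-i] regex min-minutes percent max-minutes [options], and squid uses the first pattern that matches, so site-specific rules must sit above the catch-all "refresh_pattern ." line. a hypothetical example (example.com is a placeholder, not part of this setup):

```
# keep zip downloads from a hypothetical mirror fresh for up to a week (10080 min)
refresh_pattern example\.com.*\.zip 1440 50% 10080 override-expire ignore-reload
```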
-- edited by RobertOppenheimer on Aug 19 2011, 01:41 PM -
2. I found pfsense and lusca-cache to be sluggish at times and wanted video caching that works with the old setup (Ubuntu/Debian + Squid) i made, which for me served content noticeably faster compared to pf+squid. but then, that's just me... :)
i highly agree with this.
though we managed to tweak some settings in pf to overcome most problems... -
^
basically, i've been observing lusca with pfsense, and i have a feeling a dedicated squid box attached to the current network will be the best solution, leaving pfsense as a multi-WAN router doing load balancing with failover, and IDS/IPS as well...
....
thanks for the post, this will help a lot of people.
..... -
ah, so i'm not the only one who noticed. +1 sir ntoskrnl.
np sir rayback. i already have the setup you described here at school.
i removed lusca from the pfsense box, then used it for multi-WAN load balancing. then i set up a separate debian box for the computer labs with videocache. i'm currently experimenting with lusca r14940 on a separate debian box. i don't have any spare switches left either, so for now i'm just using a crossover cable. :D -
lusca-cache r14940 works on debian too. i just tested it a few minutes ago.
same setup as above; i'll only note the changes for getting the latest sources.
1. install svn
apt-get install subversion
2. get lusca sources
svn checkout http://lusca-cache.googlecode.com/svn/branches/LUSCA_HEAD/ lusca-cache
3. the files are automatically saved to /root/lusca-cache, assuming you are logged in as root. Now we'll build it.
cd /root/lusca-cache
./bootstrap
./configure --prefix=/usr --exec_prefix=/usr --bindir=/usr/sbin --sbindir=/usr/sbin --libexecdir=/usr/lib/squid --sysconfdir=/etc/squid --localstatedir=/var/spool/squid --datadir=/usr/share/squid --enable-async-io --with-pthreads --enable-ssl --enable-storeio=aufs,coss,null --enable-linux-netfilter --enable-arp-acl --enable-epoll --enable-removal-policies=lru,heap --enable-snmp --enable-delay-pools --enable-htcp --enable-cache-digests --enable-referer-log --enable-useragent-log --enable-auth=basic,digest,ntlm,negotiate --enable-http-violations --enable-follow-x-forwarded-for --with-large-files --with-maxfd=65536
4. make && make install
5. now for the squid.conf and storeurl.pl. these files were taken from the stock pfsense installation of lusca. i made some changes to mine; feel free to make your own changes too.
squid.conf
# Do not edit manually !
http_port 3128 transparent
icp_port 0
pid_filename /var/spool/squid/logs/squid.pid
cache_effective_user proxy
cache_effective_group proxy
#error_directory /usr/local/etc/squid/errors/English
#icon_directory /usr/local/etc/squid/icons
visible_hostname localhost
cache_mgr admin@localhost
access_log none
cache_log /var/spool/squid/logs/cache.log
cache_store_log none
shutdown_lifetime 0 seconds
uri_whitespace strip
cache_mem 8 MB
maximum_object_size_in_memory 4 KB
memory_replacement_policy heap GDSF
cache_replacement_policy heap LFUDA
cache_dir aufs /var/spool/squid/cache 1024 32 256
minimum_object_size 0 KB
maximum_object_size 64 MB
offline_mode off
# No redirector configured
# Setup some default acls
acl all src 0.0.0.0/0.0.0.0
acl localhost src 127.0.0.1/255.255.255.255
acl localnet src 192.168.1.0/24
acl safeports port 21 70 80 210 280 443 488 563 591 631 777 901 3128 1025-65535
acl sslports port 443 563
acl manager proto cache_object
acl purge method PURGE
acl connect method CONNECT
acl partialcontent_req req_header Range .*
#acl dynamic urlpath_regex cgi-bin \?
# $Rev$
acl store_rewrite_list urlpath_regex \/(get_video|videoplayback\?id|videoplayback.*id) \.(jp(e?g|e|2)|gif|png|tiff?|bmp|ico|flv|wmv|3gp|mp(4|3)|exe|msi|zip|on2|mar|swf|fid)\?
acl store_rewrite_list_domain url_regex ^http:\/\/([a-zA-Z-]+[0-9-]+)\.[A-Za-z]*\.[A-Za-z]*
acl store_rewrite_list_domain url_regex (([a-z]{1,2}[0-9]{1,3})|([0-9]{1,3}[a-z]{1,2}))\.[a-z]*[0-9]?\.[a-z]{3}
acl store_rewrite_list_path urlpath_regex \.(jp(e?g|e|2)|gif|png|tiff?|bmp|ico|flv|avc|zip|mp3|3gp|rar|on2|mar|exe)$
acl store_rewrite_list_domain_CDN url_regex (khm|mt)[0-9]?.google.com streamate.doublepimp.com.*\.js\? photos-[a-z].ak.fbcdn.net \.rapidshare\.com.*\/[0-9]*\/.*\/[^\/]* ^http:\/\/(www\.ziddu\.com.*\.[^\/]{3,4})\/(.*) \.doubleclick\.net.* yieldmanager cpxinteractive ^http:\/\/[.a-z0-9]*\.photobucket\.com.*\.[a-z]{3}$ quantserve\.com
#acl rapidurl url_regex \.rapidshare\.com.*\/[0-9]*\/[0-9]*\/[^\/]*
#acl video urlpath_regex \.((mpeg|ra?m|avi|mp(g|e|4)|mov|divx|asf|qt|wmv|m\dv|rv|vob|asx|ogm|flv|3gp)(\?.*)?)$ (get_video\?|videoplayback\?|videodownload\?|\.flv(\?.*)?)
#acl html url_regex \.((html|htm|php|js|css|aspx)(\?.*)?)$ \.com\/$ \.com$
#acl images urlpath_regex \.((jp(e?g|e|2)|gif|png|tiff?|bmp|ico)(\?.*)?)$
acl dontrewrite url_regex redbot\.org (get_video|videoplayback\?id|videoplayback.*id).*begin\=[1-9][0-9]* \.php\? threadless.*\.jpg\?r=
acl getmethod method GET
storeurl_access deny dontrewrite
storeurl_access deny !getmethod
storeurl_access allow store_rewrite_list_domain_CDN
storeurl_access allow store_rewrite_list
storeurl_access allow store_rewrite_list_domain store_rewrite_list_path
storeurl_access deny all
storeurl_rewrite_program /etc/squid/storeurl.pl
storeurl_rewrite_children 1
storeurl_rewrite_concurrency 30
#acl snmppublic snmp_community public
#cachemgr_passwd none config reconfigure
#work around for fragment videos of msn
acl msnvideo url_regex QualityLevel.*Fragment
http_access deny msnvideo
#always_direct allow html
#cache_peer localhost parent 4001 0 carp login=PASS name=backend-1
max_stale 10 years
# $Rev$
# [email protected]
# 1 year = 525600 mins, 1 month = 43800 mins
refresh_pattern (get_video|videoplayback|videodownload|\.flv).*(begin|start)\=[1-9][0-9]* 0 0% 0
refresh_pattern imeem.*\.flv 0 0% 0 override-lastmod override-expire
refresh_pattern ^ftp: 40320 20% 40320 override-expire reload-into-ims store-stale
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern code.google.com.*(svn|download) 0 50% 1440 reload-into-ims
#ads
refresh_pattern ^.*(streamate.doublepimp.com.*\.js\?|utm\.gif|ads\?|rmxads\.com|ad\.z5x\.net|bh\.contextweb\.com|bstats\.adbrite\.com|a1\.interclick\.com|ad\.trafficmp\.com|ads\.cubics\.com|ad\.xtendmedia\.com|\.googlesyndication\.com|advertising\.com|yieldmanager|game-advertising\.com|pixel\.quantserve\.com|adperium\.com|doubleclick\.net|adserving\.cpxinteractive\.com|syndication\.com|media.fastclick.net).* 5259487 20% 5259487 ignore-no-cache ignore-no-store ignore-private override-expire ignore-reload ignore-auth ignore-must-revalidate store-stale negative-ttl=40320 max-stale=1440
#antivirus
refresh_pattern avast.com.*\.vpx 40320 50% 161280 store-stale reload-into-ims
refresh_pattern (avgate|avira).*\.(idx|gz)$ 1440 90% 1440 ignore-reload ignore-no-cache ignore-no-store store-stale ignore-must-revalidate
refresh_pattern kaspersky.*\.avc$ 5259487 999999% 5259487 ignore-reload store-stale
refresh_pattern kaspersky 1440 50% 161280 ignore-no-cache store-stale
refresh_pattern mbamupdates.com.*\.ref 1440 50% 161280 reload-into-ims store-stale
#specific sites
refresh_pattern \.rapidshare.*\/[0-9]*\/.*\/[^\/]* 161280 90% 161280 ignore-reload store-stale
refresh_pattern (get_video\?|videoplayback\?|videodownload\?|\.flv\?|\.fid\?) 5259487 99999999% 5259487 override-expire ignore-reload store-stale ignore-private negative-ttl=0
refresh_pattern \.(ico|video-stats) 5259487 999999% 5259487 override-expire ignore-reload ignore-no-cache ignore-no-store ignore-private ignore-auth override-lastmod ignore-must-revalidate negative-ttl=10080 store-stale
refresh_pattern \.etology\? 5259487 999999% 5259487 override-expire ignore-reload ignore-no-cache store-stale
refresh_pattern galleries\.video(\?|sz) 5259487 999999% 5259487 override-expire ignore-reload ignore-no-cache store-stale
refresh_pattern brazzers\? 5259487 999999% 5259487 override-expire ignore-reload ignore-no-cache store-stale
refresh_pattern \.adtology\? 5259487 999999% 5259487 override-expire ignore-reload ignore-no-cache store-stale
refresh_pattern ^.*safebrowsing.*google 5259487 999999% 5259487 override-expire ignore-reload ignore-no-cache ignore-no-store ignore-private ignore-auth ignore-must-revalidate negative-ttl=10080 store-stale
refresh_pattern ^http://((cbk|mt|khm|mlt)[0-9]?)\.google\.co(m|\.uk) 5259487 999999% 5259487 override-expire ignore-reload store-stale ignore-private negative-ttl=10080
refresh_pattern ytimg\.com.*\.(jpg|png) 5259487 999999% 5259487 override-expire ignore-reload store-stale
refresh_pattern images\.friendster\.com.*\.(png|gif) 5259487 999999% 5259487 override-expire ignore-reload store-stale
refresh_pattern ((facebook.com)|(85.131.151.39)).*\.(png|gif) 5259487 999999% 5259487 override-expire ignore-reload store-stale
refresh_pattern garena\.com 5259487 999999% 5259487 override-expire reload-into-ims store-stale
refresh_pattern photobucket.*\.(jp(e?g|e|2)|tiff?|bmp|gif|png) 5259487 999999% 5259487 override-expire ignore-reload store-stale
refresh_pattern vid\.akm\.dailymotion\.com.*\.on2\? 5259487 999999% 5259487 ignore-no-cache override-expire override-lastmod store-stale
refresh_pattern .fbcdn.net.*\.(jpg|gif|png) 5259487 999999% 5259487 ignore-no-cache override-expire ignore-reload store-stale negative-ttl=0
refresh_pattern ^http:\/\/images|pics|thumbs[0-9]\. 5259487 999999% 5259487 ignore-no-cache ignore-no-store ignore-reload override-expire store-stale
refresh_pattern ^http:\/\/www.onemanga.com.*\/ 5259487 999999% 5259487 reload-into-ims override-expire store-stale
refresh_pattern mediafire.com\/images.*\.(jp(e?g|e|2)|tiff?|bmp|gif|png) 5259487 999999% 5259487 reload-into-ims override-expire ignore-private store-stale
#general
refresh_pattern \.(jp(e?g|e|2)|tiff?|bmp|gif|png) 5259487 999999% 5259487 ignore-no-cache ignore-no-store reload-into-ims override-expire ignore-must-revalidate store-stale
refresh_pattern \.(z(ip|[0-9]{2})|r(ar|[0-9]{2})|jar|bz2|gz|tar|rpm|vpu) 5259487 999999% 5259487 override-expire reload-into-ims
refresh_pattern \.(mp3|wav|og(g|a)|flac|midi?|rm|aac|wma|mka|ape) 5259487 999999% 5259487 override-expire reload-into-ims ignore-reload
refresh_pattern \.(exe|msi|dmg|bin|xpi|iso|swf|mar|psf|cab) 5259487 999999% 5259487 override-expire reload-into-ims ignore-no-cache ignore-must-revalidate
refresh_pattern \.(mpeg|ra?m|avi|mp(g|e|4)|mov|divx|asf|wmv|m\dv|rv|vob|asx|ogm|flv|3gp|on2) 5259487 9999999% 5259487 override-expire reload-into-ims
refresh_pattern -i (cgi-bin) 0 0% 0
refresh_pattern \.(php|jsp|cgi|asx)\? 0 0% 0
refresh_pattern . 0 50% 161280 store-stale
#acl shoutcast rep_header X-HTTP09-First-Line ^ICY.[0-9]
#upgrade_http0.9 deny shoutcast
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
#read_ahead_gap 0 KB
#ie_refresh on
reload_into_ims on
strip_query_terms off
deny_info TCP_RESET localnet
negative_dns_ttl 1 second
negative_ttl 1 second
#snmp_port 3401
#snmp_access allow snmppublic all
maximum_single_addr_tries 2
retry_on_error on
n_aiops_threads 64
#request_header_max_size 128 KB
#reply_header_max_size 128 KB
#range_offset_limit 10 MB
vary_ignore_expire on
#client_db off # this needs to be on for acl maxconn to work
ipcache_size 4096
fqdncache_size 20
#tcp_recv_bufsize 64 KB
pipeline_prefetch on
#half_closed_clients off
# 0x10 no delay, 0x08 throughput, 0x04 reliability
# 0x10 10000 (minimize delay) Use delay metric
# 0x08 01000 (maximize throughput) Use default metric
# 0x04 00100 (maximize reliability) Use reliability metric
# 0x02 00010 (minimize monetary cost) Use cost metric
# dscp squidtos+ECN
# 56 0xE0 11100000
# 48 0xc0 11000000
# 08 0x20 00100000
# 32 0x80 10000000
# 16 0x40 01000000
#tcp_outgoing_tos 0x03 video
#tcp_outgoing_tos 0xb8 html
#tcp_outgoing_tos 0x20 images
#tcp_outgoing_tos 0x02 all
#zph_mode tos
#zph_local 0xb8
#zph_parent 0x08
#acl monitor url_regex avira
#logformat chudy %ts.%03tu %6tr %>a %Ss/%03Hs %<st %rm %ru %mt http%rv Rq[%>h] Rp[%<h]
#access_log /var/squid/log/access2.log chudy monitor
#buffered_logs on
#download_fastest_client_speed on
#acl text rep_header Content-Type -i text\/
#acl hit rep_header X-Cache -i hit
#acl partial rep_header Content-Range .*
#log_access deny partial
#log_access deny php
#log_access deny text
#log_access deny hit
#log_access deny html
#log_access deny !getmethod
high_page_fault_warning 50
#log_access deny manager
#cache deny dynamic
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !safeports
http_access deny CONNECT !sslports
# Always allow localhost connections
http_access allow localhost
http_access allow localnet
# Allow local network(s) on interface(s)
# Default block all to be sure
http_access deny all
storeurl.pl
#!/usr/bin/perl
# $Rev$
# by [email protected]
# Youtube updates at http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube/Discussion
$|=1;
while (<>) {
@X = split;
# $X[1] =~ s/&sig=.*//;
$x = $X[0] . " ";
$_ = $X[1];
$u = $X[1];
#photos-X.ak.fbcdn.net where X a-z
if (m/^http:\/\/photos-[a-z]?(.ak.fbcdn.net.*)/) {
print $x . "http://photos" . $1 . "\n";
#maps.google.com
} elsif (m/^http:\/\/(khm|mt)[0-9]?(.google.com.*)/) {
print $x . "http://" . $1 . $2 . "\n";
# compatibility for old cached get_video?video_id
} elsif (m/^http:\/\/([0-9.]{4}|.*\.youtube\.com|.*\.googlevideo\.com|.*\.video\.google\.com).*?(videoplayback\?id=.*?|video_id=.*?)\&(.*?)/) {
$z = $2; $z =~ s/video_id=/get_video?video_id=/;
print $x . "http://video-srv.youtube.com.SQUIDINTERNAL/" . $z . "\n";
# youtube 1080p HD itag=37, 720p HD itag=22
} elsif (m/^http:\/\/([0-9.]{4}|.*\.youtube\.com|.*\.googlevideo\.com|.*\.video\.google\.com).*?\&(itag=37|itag=22).*?\&(id=[a-zA-Z0-9]*)/) {
print $x . "http://video-srv.youtube.com.SQUIDINTERNAL/" . $2 . "&" . $3 . "\n";
# youtube 360p itag=34, 480p itag=35 and others
} elsif (m/^http:\/\/([0-9.]{4}|.*\.youtube\.com|.*\.googlevideo\.com|.*\.video\.google\.com).*?\&(itag=[0-9]*).*?\&(id=[a-zA-Z0-9]*)/) {
print $x . "http://video-srv.youtube.com.SQUIDINTERNAL/" . $3 . "\n";
} elsif (m/^http:\/\/www\.google-analytics\.com\/__utm\.gif\?.*/) {
print $x . "http://www.google-analytics.com/__utm.gif\n";
#Cache High Latency Ads
} elsif (m/^http:\/\/([a-z0-9.]*)(\.doubleclick\.net|\.quantserve\.com|\.googlesyndication\.com|yieldmanager|cpxinteractive)(.*)/) {
$y = $3;$z = $2;
for ($y) {
s/pixel;.*/pixel/;
s/activity;.*/activity/;
s/(imgad[^&]*).*/\1/;
s/;ord=[?0-9]*//;
s/;&timestamp=[0-9]*//;
s/[&?]correlator=[0-9]*//;
s/&cookie=[^&]*//;
s/&ga_hid=[^&]*//;
s/&ga_vid=[^&]*//;
s/&ga_sid=[^&]*//;
# s/&prev_slotnames=[^&]*//
# s/&u_his=[^&]*//;
s/&dt=[^&]*//;
s/&dtd=[^&]*//;
s/&lmt=[^&]*//;
s/(&alternate_ad_url=http%3A%2F%2F[^(%2F)]*)[^&]*/\1/;
s/(&url=http%3A%2F%2F[^(%2F)]*)[^&]*/\1/;
s/(&ref=http%3A%2F%2F[^(%2F)]*)[^&]*/\1/;
s/(&cookie=http%3A%2F%2F[^(%2F)]*)[^&]*/\1/;
s/[;&?]ord=[?0-9]*//;
s/[;&]mpvid=[^&;]*//;
s/&xpc=[^&]*//;
# yieldmanager
s/\?clickTag=[^&]*//;
s/&u=[^&]*//;
s/&slotname=[^&]*//;
s/&page_slots=[^&]*//;
}
print $x . "http://" . $1 . $2 . $y . "\n";
#cache high latency ads
} elsif (m/^http:\/\/(.*?)\/(ads)\?(.*?)/) {
print $x . "http://" . $1 . "/" . $2 . "\n";
} elsif (m/^http:\/\/(www\.ziddu\.com.*\.[^\/]{3,4})\/(.*?)/) {
print $x . "http://" . $1 . "\n";
#cdn, variable 1st path
} elsif (($u =~ /filehippo/) && (m/^http:\/\/(.*?)\.(.*?)\/(.*?)\/(.*)\.([a-z0-9]{3,4})(\?.*)?/)) {
@y = ($1,$2,$4,$5);
$y[0] =~ s/[a-z0-9]{2,5}/cdn./;
print $x . "http://" . $y[0] . $y[1] . "/" . $y[2] . "." . $y[3] . "\n";
#rapidshare
} elsif (($u =~ /rapidshare/) && (m/^http:\/\/(([A-Za-z]+[0-9-.]+)*?)([a-z]*\.[^\/]{3}\/[a-z]*\/[0-9]*)\/(.*?)\/([^\/\?\&]{4,})$/)) {
print $x . "http://cdn." . $3 . "/SQUIDINTERNAL/" . $5 . "\n";
} elsif (($u =~ /maxporn/) && (m/^http:\/\/([^\/]*?)\/(.*?)\/([^\/]*?)(\?.*)?$/)) {
print $x . "http://" . $1 . "/SQUIDINTERNAL/" . $3 . "\n";
#domain/path/.*/path/filename
} elsif (($u =~ /fucktube/) && (m/^http:\/\/(.*?)(\.[^\.\-]*?[^\/]*\/[^\/]*)\/(.*)\/([^\/]*)\/([^\/\?\&]*)\.([^\/\?\&]{3,4})(\?.*?)$/)) {
@y = ($1,$2,$4,$5,$6);
$y[0] =~ s/(([a-zA-Z]+[0-9]+(-[a-zA-Z])?$)|([^\.]*cdn[^\.]*)|([^\.]*cache[^\.]*))/cdn/;
print $x . "http://" . $y[0] . $y[1] . "/" . $y[2] . "/" . $y[3] . "." . $y[4] . "\n";
#like pornhub: variable url and center part of the path, filename extension 3 or 4 chars, with or without ? at the end
} elsif (($u =~ /tube8|pornhub|xvideos/) && (m/^http:\/\/(([A-Za-z]+[0-9-.]+)*?(\.[a-z]*)?)\.([a-z]*[0-9]?\.[^\/]{3}\/[a-z]*)(.*?)((\/[a-z]*)?(\/[^\/]*){4}\.[^\/\?]{3,4})(\?.*)?$/)) {
print $x . "http://cdn." . $4 . $6 . "\n";
#for yimg.com video
} elsif (m/^http:\/\/(.*yimg.com)\/\/(.*)\/([^\/\?\&]*\/[^\/\?\&]*\.[^\/\?\&]{3,4})(\?.*)?$/) {
print $x . "http://cdn.yimg.com//" . $3 . "\n";
#for yimg.com doubled
} elsif (m/^http:\/\/(.*?)\.yimg\.com\/(.*?)\.yimg\.com\/(.*?)\?(.*)/) {
print $x . "http://cdn.yimg.com/" . $3 . "\n";
#for yimg.com with &sig=
} elsif (m/^http:\/\/([^\.]*)\.yimg\.com\/(.*)/) {
@y = ($1,$2);
$y[0] =~ s/[a-z]+([0-9]+)?/cdn/;
$y[1] =~ s/&sig=.*//;
print $x . "http://" . $y[0] . ".yimg.com/" . $y[1] . "\n";
#youjizz. We use only domain and filename
} elsif (($u =~ /media[0-9]{1,5}\.youjizz/) && (m/^http:\/\/(.*?)(\.[^\.\-]*?\.[^\/]*)\/(.*)\/([^\/\?\&]*)\.([^\/\?\&]{3,4})(\?.*?)$/)) {
@y = ($1,$2,$4,$5);
$y[0] =~ s/(([a-zA-Z]+[0-9]+(-[a-zA-Z])?$)|([^\.]*cdn[^\.]*)|([^\.]*cache[^\.]*))/cdn/;
print $x . "http://" . $y[0] . $y[1] . "/" . $y[2] . "." . $y[3] . "\n";
#general purpose for cdn servers. add above your specific servers.
} elsif (m/^http:\/\/([0-9.]*?)\/\/(.*?)\.(.*)\?(.*?)/) {
print $x . "http://squid-cdn-url//" . $2 . "." . $3 . "\n";
} elsif (m/^http:\/\/(.*?)(\.[^\.\-]*?\..*?)\/([^\?\&\=]*)\.([\w\d]{2,4})\??.*$/) {
@y = ($1,$2,$3,$4);
$y[0] =~ s/([a-z][0-9][a-z]dlod[\d]{3})|((cache|cdn)[-\d]*)|([a-zA-Z]+-?[0-9]+(-[a-zA-Z]*)?)/cdn/;
print $x . "storeurl://" . $y[0] . $y[1] . "/" . $y[2] . "." . $y[3] . "\n";
# all that ends with ;
} elsif (m/^http:\/\/(.*?)\/(.*?)\;(.*)/) {
print $x . "http://" . $1 . "/" . $2 . "\n";
} else {
print $x . $_ . "\n";
}
}
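The yimg rules above all reduce a load-balanced hostname to a fixed `cdn` prefix so that squid stores one copy of the object. A minimal stand-in for that idea, using sed instead of the Perl rewriter (the hostnames are made up):

```shell
# Two made-up load-balanced yimg.com mirrors of the same object:
# normalize the server prefix to "cdn" so both map to one cache key.
# This mimics the $y[0] =~ s/[a-z]+([0-9]+)?/cdn/ rule above.
for url in http://d1.yimg.com/vid/clip.flv http://d7.yimg.com/vid/clip.flv; do
  echo "$url" | sed 's#^http://[a-z]*[0-9]*\.yimg\.com#http://cdn.yimg.com#'
done
```

Both lines come out as `http://cdn.yimg.com/vid/clip.flv`, which is exactly why a second client's request for the "other" mirror becomes a cache hit.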
As you can see, Chudy Fernandez's name is still retained in the script. Credit for the script and the squid.conf files goes to him. Happy caching :D -
hello and thank you zabedinasis. https sites fall under ssl; sadly I have not tried experimenting with it yet, but the compile option --enable-ssl should be more than enough to let us do that. Just refresh patterns and ACLs, as well as a storeurl entry. Hopefully I can start on that in the next few days. :D
-
That's possible, sir, as long as you have the ACLs and refresh patterns. Yes: iptables rules to redirect the SSL port to squid's https_port, then the ACL storeurl rewrite rules, then the storeurl rewrite entry. Usually, even when the site and the video content are HTTPS, there's a high probability they use multiple load-balanced servers for content distribution, just like YouTube. Fortunately squid 2.7 and lusca-head support collapsed forwarding. So yes is the short answer; it's just really hard to feel out how, because we need to study gnu_regex, Perl, and the squid documentation. It would also help if we could write a custom rewriter in a native language rather than an interpreted one, for efficiency. There is a fast rewriter available for squid called jesred; I'll test that later on as well. My feeling is that plain Perl would be a bit slow. I'll study regex first, then try jesred.
-
Bookmarked! thanks for the share...
-
Now a thought comes to mind...
lusca-cache videocache only
+
lusca-cache or squid3 for static web contents
this setup uses lusca-cache as a cache_peer parent of squid3. One of the two squid instances would use a different squid.conf file; this can be done with "squid -f /etc/squid/squid2.conf", and of course it's going to use different directories for its log files and cache. Will post a tut if I get it to work.
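A sketch of what the second instance's key settings might look like (every path, port, and filename below is assumed, not tested):

```
# /etc/squid/squid2.conf -- hypothetical front-end instance
http_port 3129
pid_filename /var/run/squid2.pid
access_log /var/log/squid2/access.log
cache_dir aufs /var/spool/squid2 2048 16 256
# hand misses to the lusca instance listening on 3128
cache_peer 127.0.0.1 parent 3128 0 no-query default
```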
I also noticed that the stock configuration of lusca in pfSense caches ads. Imagine 30 ad images on a single page: all 30 of them would require a rewrite child or a concurrency slot, which is too high a price to pay for pulling up a single page. Block ads? Or is there a better, faster and more efficient way to deal with those high-latency ads? -
Ad Zapping with Squid <click here for link>
-
Hello again ^_^
2 instances of squid LUSCA_HEAD r14940 tested. It works pretty well, although it's not quite as stable as a properly configured single squid.
At some point I encountered an endless request loop; it might be because I didn't implement the anti-self-DDoS ACLs in both instances. But I'm going to give it a rest for a while.
The test rig is still using Debian, with slightly modified settings for squid.conf and the storeurl rewriter.
I reverted back to the single-squid (LUSCA_HEAD) setup and tried to install squidGuard. Take note that you cannot do an apt-get install squidguard, because it will force apt to download Debian's squid package, which will totally screw up your original settings.
Here's how we can install squidGuard on an existing installation. Yep, you guessed it right: install from source.
1. Install squidGuard's requirements. squidGuard has errors with db4.7, so we'll build against version 4.6 (the -dev package provides the headers):
apt-get install libdb4.6-dev flex bison
2. grab the squidGuard source, i got version 1.4. if you want to experiment you can try the beta version 1.5
cd /usr/src
wget http://squidguard.org/Downloads/squidGuard-1.4.tar.gz
tar -xvzf squidGuard*
3. build squidGuard
cd /usr/src/squidGuard*
./configure --prefix=/usr --bindir=/usr/sbin --sbindir=/usr/sbin --sysconfdir=/etc/squid --localstatedir=/var/squid --with-db-lib=/usr/lib --with-squid-user=proxy
make && make install
4. Configure squidGuard. But before doing that, let's note the directories and download the blacklist:
squidGuard.conf is at /usr/local/squidGuard
blocklist files are at /usr/local/squidGuard/db
cd /usr/local/squidGuard/db
wget http://www.shallalist.de/Downloads/shallalist.tar.gz
tar -xvzf shallalist.tar.gz
chown -R proxy:proxy BL
Now let's edit squidGuard.conf
nano /usr/local/squidGuard/squidGuard.conf
change dbhome to /usr/local/squidGuard/db/BL
here's a sample squidGuard.conf that blocks ads and redirects them to a Facebook page. Yeah, I kinda hate ads; I bet you do too. :D
#squidGuard configuration
dbhome /usr/local/squidGuard/db/BL
logdir /usr/local/squidGuard/log
dest adv {
domainlist adv/domains
urllist adv/urls
}
acl {
default {
pass !adv all
redirect http://www.facebook.com/
}
}
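For a feel of what squidGuard does with each request, here is a toy stand-in written with awk (not the real program): squid hands the redirector lines of the form `URL client-ip/fqdn ident method`, and the helper answers with a redirect URL for blocked requests or, in the classic protocol, an empty line for pass-through. The domains here are made up.

```shell
# Toy redirector: "block" anything whose URL contains "ads." and send
# it to the same redirect target as the squidGuard.conf sample above.
printf '%s\n' \
  'http://ads.example.com/banner.gif 192.168.1.5/- - GET' \
  'http://www.debian.org/ 192.168.1.5/- - GET' |
awk '$1 ~ /ads\./ { print "http://www.facebook.com/"; next } { print "" }'
```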
5. Next we'll compile the URL and domain database blacklists and add squidGuard to squid as the redirect program:
squidGuard -d -C all
nano /etc/squid/squid.conf
Add these lines to your squid.conf file:
url_rewrite_program /usr/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
url_rewrite_children 3
save it and restart squid
squid -k shutdown
squid
or run squid -NCd1 to check squid for errors. If everything goes smoothly and it blocks ads, then you can Ctrl+C and run squid again normally.
You can also edit storeurl.pl and the squid ACLs to stop caching high-latency ads.
I haven't started experimenting with caching online game updates, but I might be doing that very soon, if my sked is free and I've got a few days to spare.
In the meantime you can do this to disable caching of game patches:
acl error_game_updates dstdomain "/etc/squid/error_games"
acl anti_self_ddos myip <squidbox-ip>
cache deny error_game_updates anti_self_ddos
cache allow all
create a file to list your game update sites.
nano /etc/squid/error_games
#error_games file
123.242.206.79 #This is Pointblank patch update site
save and restart squid.
Notes:
Q: Why not use url_rewrite_concurrency? It's way faster than url_rewrite_children, yah dimwit.
A: squidGuard is an old-style URL rewrite program and does not support concurrency yet. It gave me errors and crashed squid.
Q: Why does the storeurl_rewrite_program script work with the concurrency option?
A: It's not the script that is concurrency-compliant; it's actually the interpreter program that is. Perl supports concurrency. :D
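To make the concurrency point concrete: with the concurrency option, squid prepends a numeric channel ID to each line it feeds the helper, and the helper must echo that same ID back with its answer. A trivial pass-through helper simulated with awk on two canned lines (the URLs are made up; a real helper would rewrite field 2):

```shell
# Channel ID in $1, URL in $2; echo the ID back with the (unchanged) URL.
printf '0 http://a.example.com/v.flv\n1 http://b.example.com/v.flv\n' |
awk '{ print $1, $2 }'
```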
happy caching
-- edited by RobertOppenheimer on Aug 21 2011, 04:11 PM -
Just a little update on the test rig I set up.
Hardware Specs
Network Controllers:
Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 02)
Atheros Communications Inc. AR5008 Wireless Network Adapter (rev 01)
Sundance Technology Inc / IC Plus Corp IC Plus IP100A Integrated 10/100 Ethernet MAC + PHY (rev 31)
WLAN info:
wlan0 IEEE 802.11bgn Mode:Master Frequency:2.442 GHz Tx-Power=20 dBm
Retry long limit:7 RTS thr:off Fragment thr:off
Power Management:off
mon.wlan0 IEEE 802.11bgn Mode:Monitor Frequency:2.442 GHz Tx-Power=20 dBm
Retry long limit:7 RTS thr:off Fragment thr:off
Power Management:off
Software installed:
Squid lusca_head r14940
squidGuard v1.4
hostapd - for atheros as wireless access point
isc-dhcp-server
bind9 - as local authoritative nameserver
webmin
ssh-server
Speedtest:
squid cache hit <click here for link>
squidGuard blocked url hits:
-- edited by RobertOppenheimer on Aug 22 2011, 02:10 PM -
If at some point you're already happy with your setup, place your server away from the general flow of people.
Then start monitoring it remotely using PuTTY and Webmin.
-
Bookmarked.
Thanks for sharing.
:) -
Hello,
@zabedinazis
Yes sir. It's a computer laboratory in the morning and a computer shop in the afternoon.
@eccen3k
np, and you're welcome.
There is also a nice tutorial at http://www.cyberciti.biz/faq/linux-tcp-tuning/
This increases rmem and wmem size from 128KB to 12MB.
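For reference, the kind of sysctl settings that guide describes look roughly like this (the exact values are assumed; check the linked article before applying anything):

```
# /etc/sysctl.conf -- raise socket buffer ceilings from 128 KB to ~12 MB
net.core.rmem_max = 12582912
net.core.wmem_max = 12582912
net.ipv4.tcp_rmem = 4096 87380 12582912
net.ipv4.tcp_wmem = 4096 65536 12582912
```

Apply with `sysctl -p`.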
very useful for LAN shops with heavy traffic and large files being served..
It will most probably work for FreeBSD as well, for those people using pfSense as a caching proxy.
It's either that or it has already been applied in the pfSense and m0n0wall distros.. :D -
Yes, I tried that too.
I used openssl to create certificates and keys and signed them with a self-made CA.
Used them with squid via: https_port 4433 transparent cert=/etc/ssl/foo.cert key=/etc/ssl/foo.key
Then redirect the LAN interface's port 443 to squid port 4433 (and the internet interface's port 443 to squid's SSL port). Basically it worked for some https sites
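The redirect described can be sketched with an iptables rule like this (the interface name and ports are assumed from this thread's setup, untested here):

```
# LAN clients' HTTPS traffic, intercepted and sent to squid's https_port
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 443 -j REDIRECT --to-port 4433
```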
but gave me a lot of headaches with the e-games account login page and the Level Up login page. To be fair, I did try really hard, but it's half-baked.
In the end I kinda gave up too, and later on accepted that doing a man-in-the-middle attack on SSL is problematic.
But who knows.. maybe someday... Now I'm going to try to install the l7 filter. I'm probably going to recompile the kernel and not use the userspace l7 filter.
A lot to read about iptables..
Thanks for the share, zabedinazis :D -
How's everyone here, sirs???
-- edited by CMacoy on Sep 06 2011, 12:49 PM -
@CMacoy
Doing fine, sir. Currently moving to a new Linux distro, namely Gentoo, and next is LFS. I have grown to love compiling utilities from source, including the kernel.
Will post a guide for my projects if anyone wants one. Also going to try to learn more about hardening *nix systems soon.
So far the tutorial is still stable, as YouTube has not changed their URLs yet. -
@robertoppenheimer
sir, is this applicable to pfSense 1.2.3? I'm just a noob.. -
@zabimaru
thanks. Also, I'm currently reading some articles about FreePBX and Asterisk for VoIP support. Very interesting topic. http://mikeoverip.wordpress.com/2010/09/26/freepbx-2-8-installation-on-debian-5-lenny/
@obiwan
yes, you may apply this mod to pfSense if you want, but I wouldn't really mess with the stock settings if you have limited RAM. I'd memorize or list the values before changing anything, too, just in case... you can never be too careful :D -
cache_dir ufs /var/spool/squid/cache 2048 32 256
i think the aufs storage scheme (ufs with async I/O, not a filesystem as such) is better -
@robertoppenheimer
sir, my pfSense is a standalone box with 2 GB RAM, is that enough? -
@elkabong
true. async mode does work better; nevertheless, those little things are left for the user to tune according to his needs.
"when people talk about whether there is a space or none between the int and the *, you get a really furious discussion." - Bjarne Stroustrup
@obiwan0515
absolutely, sir :D
-- edited by RobertOppenheimer on Sep 10 2011, 12:44 AM -
thanks for this very informative thread, sir.
I tried it in VMware on Win7 32-bit,
using the post above.
Just a single NIC, so it's only a caching proxy for the web browsers;
bridged connection.
Then I configured the other computers on the LAN to use the VMware IP as their proxy.
It worked, and the YouTube videos got cached....
faster than the pfSense + lusca-cache setup I tried.
I just want to ask, sir:
I was logged in to YouTube on the server PC,
then when I tried browsing YouTube on a client PC,
it opened fast, but my account was still the one logged in on the client PC ;p -
how do I add a NetLimiter or bandwidth shaper to this??
-
sorry for the late reply, all thanks to typhoon Pedring.. :(
@Balwer
I'm glad you made it work. Hmm, your browser probably remembered your username and password, sir, that's why. :D
@CMacoy
I haven't tried bandwidth limiting yet, sir, but we can explore a lot of QoS and bandwidth scheduling options in Linux, even the netfilter l7 userspace tools :D
cookies here.. http://lartc.org/
-- edited by RobertOppenheimer on Oct 15 2011, 06:29 PM -
for blocking sites, do you use squid (ACLs) or iptables, sir? Is there a GUI version of iptables?
-
I use squidGuard for blocking sites, sir. It has limitations though, like SSL-enabled sites, plus you always have to update the definitions. For bandwidth limiting, I used m0n0wall with the magic shaper. Dual WAN is now handled by a dual-WAN router. For redirecting all the other ports to squid's ports, for example 3127, 3124, 8080, I used iptables, although ACLs will work as well.
-
sometimes, sir, instead of a domain name they type the IP address into the address bar, so it slips through :(
-
squidGuard has an option to deny IPs, sir.
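One hedged way to express that in squidGuard.conf (the dest name and file path here are made up; the expression simply matches raw-IP URLs):

```
# hypothetical addition to squidGuard.conf
dest ipurls {
    expressionlist ipurls/expressions
    # ipurls/expressions contains one line:
    # ^https?://[0-9]{1,3}(\.[0-9]{1,3}){3}(:[0-9]+)?(/|$)
}
acl {
    default {
        pass !adv !ipurls all
        redirect http://www.facebook.com/
    }
}
```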