Search Results

Search found 43201 results on 1729 pages for 'wedding web designs'.

Page 766/1729 | < Previous Page | 762 763 764 765 766 767 768 769 770 771 772 773  | Next Page >

  • Using LDAP Attributes to improve performance for large directories

    - by Vineet Bhatia
    We have an LDAP directory with more than 50,000 users in it. Our LDAP vendor suggests a maximum of 40,000 users per LDAP group. We have a number of inactive users and those are being purged, but what if we don't get below 40,000 users? Would switching to a multivalued attribute at the user-record level, instead of LDAP groups, yield better performance during authentication, adding new users, and so on? I know most server software (portals, application servers, etc.) uses LDAP groups. But we have a standardized web service interface for access control instead of relying on server software to map LDAP groups to security roles. Each application uses this common "access control web service". Security roles are used within each application to build the fine-grained ACLs used within each enterprise application.
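
    A minimal PHP sketch of the multivalued-attribute idea described above: the access control web service reads roles straight off the user's entry instead of resolving membership in large LDAP groups. The host, base DN, bind account and the attribute name appRole are placeholders for illustration, not values taken from the question.

```php
<?php
// Hypothetical sketch: authorize by reading a multivalued role attribute
// from the user's own entry instead of enumerating large LDAP groups.
$ldap = ldap_connect('ldaps://ldap.example.com');            // placeholder host
ldap_set_option($ldap, LDAP_OPT_PROTOCOL_VERSION, 3);
ldap_bind($ldap, 'cn=svc-acl,ou=services,dc=example,dc=com', 'secret');

$uid    = 'jsmith';                                          // user being checked
$result = ldap_search(
    $ldap,
    'ou=people,dc=example,dc=com',                           // placeholder base DN
    sprintf('(uid=%s)', ldap_escape($uid, '', LDAP_ESCAPE_FILTER)),
    array('appRole')                                         // hypothetical multivalued attribute
);
$entries = ldap_get_entries($ldap, $result);

// Each value of appRole on the single user entry becomes a security role.
$roles = array();
if ($entries['count'] > 0 && isset($entries[0]['approle'])) {
    for ($i = 0; $i < $entries[0]['approle']['count']; $i++) {
        $roles[] = $entries[0]['approle'][$i];
    }
}
var_dump($roles);
```

    The trade-off to benchmark is one indexed search returning a single entry versus whatever group expansion the directory currently does; whether that wins depends on the product, which is exactly what the question is asking.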

    Read the article

  • File/folder permissions and groups on Linux with Apache

    - by phobia
    I'm trying to learn about permissions on a Linux web server running Apache. Some clues to the system: the server I have to play around with is Fedora based, and Apache runs as apache:apache. To allow e.g. PHP to write to a file, the file needs to be chmod 777; 755 is not sufficient. What I'm wondering is basically how to set up permissions the way they should be on e.g. a "shared web host". My main problem is that if I set permissions so that one user cannot access another's home folder, then Apache can't read from the public_html folder either. To keep the users out I need to set chmod 700, but to let Apache read I need at least execute on world, so 701 basically works, but it still won't let some users in. So I'm really stuck on what to do. I have been considering adding the apache user to the four groups below to avoid having to add the world execute flag, but is that a bad thing? Or should it be the other way around, with the users in the groups below also being in the apache group? I was aiming at having 4 groups: 1. webapp - same as dev_int, but is the only one that can go inside the webapp/live folder, e.g. to do an update from the repo. 2. dev_int - can read, write and execute everything in the "web root", including the two below, but nothing outside of the web root. 3. dev_ext - can read, write and execute in all client folders, but cannot access anything outside of the webapp root. 4. clients - basic FTP accounts; each has a home folder with a public_html, but cannot access any other home folders. An example of the folder structure: webroot (no users in the aforementioned groups can go outside of here); some_project (:dev_int only); webapp containing live (:webapp only) and staging (:dev_int and :dev_ext); clients (:dev_int and :dev_ext) containing client_1 (:dev_int, :dev_ext and client1:clients) with its public_html; dev containing developer_1 (developer_1:dev_int OR :dev_ext) with its public_html.
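
    One quick check related to the "PHP needs 777 to write" detail above: needing world-writable files usually means PHP runs as apache:apache rather than as the file owner (which is what suexec/suPHP-style setups change). A tiny sketch, assuming the POSIX extension is enabled, to confirm which identity PHP actually runs under:

```php
<?php
// Print the effective user/group of the PHP process and whether it can
// write to the directory this script lives in.
$user  = posix_getpwuid(posix_geteuid());
$group = posix_getgrgid(posix_getegid());
printf("running as %s:%s\n", $user['name'], $group['name']);
printf("can write to %s: %s\n", __DIR__, is_writable(__DIR__) ? 'yes' : 'no');
```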

    Read the article

  • IE session (-nomerge) management application?

    - by skrco
    I'm in need of an application that can manage multiple Internet Explorer instances (to be precise, -nomerge sessions) the way Remote Desktop Manager does with RDP connections. This app should host them in a single window and arrange the IE instances e.g. in tabs or lists. OK, in Remote Desktop Manager you can create a web session, but in embedded mode you cannot set the -nomerge option - all windows and tabs share the same session. I've been searching the web, but with no results. So I'm asking whether anyone knows of such an application or any workaround, or whether I have to write my own app.

    Read the article

  • help with Outlook Exchange server and curl

    - by stib
    I work on a Mac in a building full of PCs, and the IT department here doesn't have IMAP access turned on on the Exchange servers, so I miss a lot of meetings because I don't get reminders; I access my mail via Outlook Web Access. I had written a script to scrape my Outlook Web Access calendar and turn it into iCal format, so I could get my reminders via Thunderbird or iCal.app. It basically downloaded the calendar page via curl, parsed the HTML and reformatted all the appointments as iCal. It wasn't elegant, but it worked. Then they changed to Outlook 2007, and it doesn't work any more. I have a sketchy knowledge of curl, and almost zero knowledge of how Outlook works. Can anyone point me towards a reference for getting calendar info out of an Exchange server without using Outlook? If I can configure curl to get the HTML I will be happy, but if there's a more elegant way, such as getting the calendar info as XML, I'll be delirious.
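
    For the output half of the problem (appointments to iCal), here is a minimal PHP sketch of writing scraped events out as an .ics file. The $appointments array is made-up placeholder data standing in for whatever the scraper manages to pull out of the new Outlook Web Access pages, which is the part that still needs solving.

```php
<?php
// Turn already-scraped appointment data into a minimal iCalendar file.
// $appointments is placeholder data for whatever the scraper returns.
$appointments = array(
    array('summary' => 'Team meeting', 'start' => '2010-04-12 10:00', 'end' => '2010-04-12 11:00'),
);

$lines = array('BEGIN:VCALENDAR', 'VERSION:2.0', 'PRODID:-//owa-scraper//EN');
foreach ($appointments as $i => $a) {
    $lines[] = 'BEGIN:VEVENT';
    $lines[] = 'UID:' . $i . '@owa-scraper.local';
    $lines[] = 'DTSTART:' . gmdate('Ymd\THis\Z', strtotime($a['start']));
    $lines[] = 'DTEND:' . gmdate('Ymd\THis\Z', strtotime($a['end']));
    $lines[] = 'SUMMARY:' . $a['summary'];
    $lines[] = 'END:VEVENT';
}
$lines[] = 'END:VCALENDAR';

// iCalendar requires CRLF line endings.
file_put_contents('calendar.ics', implode("\r\n", $lines) . "\r\n");
```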

    Read the article

  • Windows HTTP proxy client to pass service requests to VPN

    - by Chris
    I've got access to a network via CheckPoint VPN (Windows client). The problem is, I have a Linux box that needs to talk to web services whose target web servers are inside the VPN. So far we have been unable to connect Linux to the VPN (and I'm not trying to solve that problem at the moment). I'm wondering if I can (temporarily) set up a proxy server on a Windows (XP) box to shuttle HTTP requests back and forth. If so, what would be a good application to do this? (Hopefully free/open source.) TIA

    Read the article

  • problem with creating a table in a phpMyAdmin database

    - by tombull89
    Hello all, I'm running a MySQL database, managed through phpMyAdmin, on my web package on a 1and1-hosted server. I've managed to set up a database in the control panel, have uploaded everything to root/phpmyadmin and changed the config.inc.php file to point at 1and1's database server (because that's the way they do it). I can go to the web interface and get to the main page, but all it shows is the database name and I can't find how to create any tables. I know it's a long shot but I'm almost out of ideas. Also, 1and1 have their own phpMyAdmin panel, which is pretty annoying to use, and a 1and1 web database which I have barely looked at. Help and suggestions much appreciated.

    Read the article

  • IIS: changing site's home directory while site is running

    - by Jeff Stewart
    I'm trying to understand exactly what IIS 6.0 (on Windows Server 2003) does when I change the "Local Path" of a web site's Home Directory while the site is running. (Specifically with regard to ASP.NET applications.) I'm trying to build support for or against this practice in a deployment scenario: e.g. deploy the new code alongside the old code, then simply switch the IIS web site's local path to the folder containing the new code. IIS seems to handle this gracefully, but I notice that w3wp.exe still keeps some handles on the old code folder after the change. That's strange to me, because I would have expected IIS to recycle the application pool if this happened. Is this safe? Is the behavior well-defined?

    Read the article

  • Kerberos: connection from win app running from IIS to SQL failed

    - by Mikhail Kislitsyn
    I have an IIS web application with Windows authentication and impersonation. This application connects to SQL Server, and in this case Kerberos works fine. But there is a problem: the web application runs a Windows application (not .NET), which also connects to the SQL Server. The Windows application runs with the IIS app user's credentials and impersonates the current site user to connect to SQL Server. Scheme: http://i.stack.imgur.com/2cgv7.png When delegation for the IIS user is set to "Trust this computer for delegation to any service", everything works fine. But I can't use this type of delegation according to our security requirements. When I set delegation to "Specific services" and choose the MSSQLSvc SPN, the connection from the Windows application fails with an "ANONYMOUS" fault, and Wireshark shows a "KRB5KDC_ERR_BADOPTION" packet. What am I doing wrong?

    Read the article

  • Are Plesk server backups useful?

    - by Michael T. Smith
    I'm working for a startup now, and I'm the programmer. Because of our small team size, I'm also handling the server management for now (until we get a dedicated server administrator). I've never used Plesk before, and the server we're using (a Media Temple Dedicated Virtual server) had it installed when I got here. One of my first jobs was to set up backups: Plesk was already running its nightly server-wide backups. I created a small script to dump the web app, its DBs and any assets, tar them, store them, and then copy them to another small server we have (to back up the backups). But we're constantly running into hard drive space issues because of the Plesk backups. And I'm wondering, are they useful? If I have the web app and all of its assets, I could easily enough get another server up and running. Do we need to keep running Plesk's backups? Thoughts?
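
    For comparison, a stripped-down sketch of the kind of app-level backup described above (dump the database, tar the web root, copy it off-box). The paths, credentials and remote host are placeholders, and it assumes mysqldump, tar and key-based scp are available on the server.

```php
<?php
// Minimal app-level backup: dump the DB, tar the web root, copy off-box.
// Every name below is a placeholder, not a value from the question.
$stamp   = date('Ymd-His');
$workdir = '/var/backups/app';
$webroot = '/var/www/vhosts/example.com/httpdocs';
$dump    = "$workdir/db-$stamp.sql";
$archive = "$workdir/app-$stamp.tar.gz";

exec(sprintf('mysqldump --user=%s --password=%s %s > %s',
    escapeshellarg('appuser'), escapeshellarg('secret'),
    escapeshellarg('appdb'), escapeshellarg($dump)), $out, $rc);
if ($rc !== 0) { exit("mysqldump failed\n"); }

exec(sprintf('tar -czf %s %s %s',
    escapeshellarg($archive), escapeshellarg($webroot), escapeshellarg($dump)), $out, $rc);
if ($rc !== 0) { exit("tar failed\n"); }

// Copy the archive to the secondary box (assumes an SSH key is already in place).
exec(sprintf('scp %s backup@backuphost.example.com:/backups/', escapeshellarg($archive)));
```

    Whether something like this can replace the Plesk dumps depends on whether anyone ever needs to restore Plesk itself (mailboxes, DNS zones, client accounts) rather than just the one application.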

    Read the article

  • Is it possible to add files to the "Wordpress Media Library" using the command line?

    - by Tom
    WordPress has its own "Media Library", which is used when you upload images and other media for use in blog posts and pages. The advantage of the media library is that it automatically produces thumbnails of the images, and the web interface gives you extra info such as who uploaded the image, which articles use the image, etc. My question is, does anyone have any tips on interacting with the media library via the command line instead of using the WordPress web interface? For example, any ideas on how to add an image to the media library from the command line? If I copy files to the media library directory (usually .../wp-content/uploads/YYYY/MM/) from the command line they do not show up in the WordPress dashboard - I guess because there needs to be an associated database entry for the media to be registered with WordPress.
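
    One way to get that database entry created from the command line is a small CLI script that bootstraps WordPress itself and registers an already-uploaded file as an attachment. A rough sketch, assuming the file has already been copied into wp-content/uploads and that the wp-load.php path is adjusted to the actual install:

```php
<?php
// Usage: php import-media.php /full/path/to/wp-content/uploads/2010/05/photo.jpg
// Bootstraps WordPress so the file gets a proper attachment post and thumbnails.
require_once '/var/www/wordpress/wp-load.php';               // adjust to the real install
require_once ABSPATH . 'wp-admin/includes/image.php';

$file = $argv[1];
$type = wp_check_filetype(basename($file));

$attachment_id = wp_insert_attachment(array(
    'post_title'     => preg_replace('/\.[^.]+$/', '', basename($file)),
    'post_mime_type' => $type['type'],
    'post_status'    => 'inherit',
), $file);

// Generate the thumbnail sizes the Media Library normally creates on upload.
$metadata = wp_generate_attachment_metadata($attachment_id, $file);
wp_update_attachment_metadata($attachment_id, $metadata);

echo "Imported as attachment #$attachment_id\n";
```

    If installing extra tooling is an option, wp-cli's "wp media import" command wraps the same machinery in a single step.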

    Read the article

  • Running phpmyadmin and suphp

    - by thor
    I have a Debian Lenny web server running apache2 with libapache2-mod-suphp. Unfortunately, suphp makes it impossible to use phpMyAdmin, as phpMyAdmin is installed in /usr/share/phpmyadmin and owned by root, and suphp disables its engine in this directory: $ cat /etc/apache2/mods-enabled/suphp.conf <IfModule mod_suphp.c> AddType application/x-httpd-php .php .php3 .php4 .php5 .phtml suPHP_AddHandler application/x-httpd-php <Directory /> suPHP_Engine on </Directory> # By default, disable suPHP for debian packaged web applications as files # are owned by root and cannot be executed by suPHP because of min_uid. <Directory /usr/share> suPHP_Engine off </Directory> </IfModule> Is there a possibility to enable the system phpMyAdmin (maybe through the standard libapache2-mod-php5) while using suphp? How?

    Read the article

  • Changing default datadirectory to one on a External NAS or add external share

    - by Hagbart Celine
    So I have been searching the web for days, looking for a solution to my problem. Now my only hope is you guys. I have installed ownCloud successfully on my Windows Server 2008 R2. It all runs smoothly and I can connect without problems, so the first checks are OK. Now I wanted to change the default data directory from my server to a shared folder on my NAS (Synology DS1813+, DSM 5.0-4493 Update 3). I tried the following: changing the directory in config.php. I changed the path in the config file from "C:\inetpub\wwwroot\myfolder\data" to "\\NASIP\cloud". After doing this the ownCloud server only shows: "Data directory (\\192.168.2.4\Cloud\data) is invalid. Please make sure that the data directory contains a file named '.ocdata' in its root." I also tried copying the files that were created in the local data storage to the share on the NAS. No luck. Then I tried mapping a network drive and using that in config.php, but still no luck; I get the same message about the missing .ocdata file. Next I tried the "External Storage" app that comes with ownCloud, thinking that at least I could add the share as external storage, but this also does not work. I tried a UNC path and the mapped drive name (Z:), but nothing helped. So now I'm turning to you. Does anyone have experience with this kind of setup? Or can you tell me how to make it work? (Default or external storage, I don't care anymore.) Setup: NAS (Synology DS1813+, DSM 5.0-4493 Update 3), ownCloud 7, Windows Server 2008 R2, IIS 7. I got an answer on another forum: "The second option is how it should be done: 1. Put OC in maintenance mode. 2. Mount (map, in the Windows world) your NAS directly to your OS. 3. Copy the local data directory to the NAS mount. 4. Ensure the permissions are set up to give the web user access to the NAS mount. 5. Update OC config.php with the new data path. 6. Disable OC maintenance mode." And this seems like the right way. "Ensure the permissions are set up to give the web user access to the NAS mount" - I guess this is where I am not sure. Which user on my server is actually making the requests to the NAS? If the user is, for example, "IUSR", can I just create an account on my Synology NAS and give it full access to my share? (But what is IUSR's password?) I have full root SSH access to my NAS, so if you can tell me what chmod or chown I need to use on my cloud folder...
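
    For reference, the relevant fragment of config.php when pointing ownCloud at a mapped drive; a sketch only, with a hypothetical drive letter. The mapping has to be visible to the account the IIS worker process runs as, and the copied data directory must still contain the .ocdata marker file the error message complains about.

```php
<?php
// Fragment of owncloud/config/config.php (not the whole file).
// 'Z:' is a hypothetical drive letter mapped to \\NASIP\cloud for the IIS user.
$CONFIG = array(
    'datadirectory' => 'Z:\\owncloud-data',
    // ...leave the rest of the existing configuration unchanged...
);
```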

    Read the article

  • .htaccess template, suggestions needed

    - by purpler
    # Defaults AddDefaultCharset UTF-8 DefaultLanguage en-US FileETag None Header unset ETag ServerSignature Off SetEnv TZ Europe/Belgrade # Rewrites Options +FollowSymLinks RewriteEngine On RewriteBase / # Redirect to WWW RewriteCond %{HTTP_HOST} ^serpentineseo.com RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L] # Redirect index to root RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/ RewriteRule ^(.*)index\.html$ /$1 [R=301,L] # Cache media files: ExpiresActive On ExpiresDefault A0 # Month <filesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$"> Header set Cache-Control "max-age=2592000, public" </filesMatch> # Week <FilesMatch "\.(css|pdf)$"> Header set Cache-Control "max-age=604800" </FilesMatch> # 10 Min <FilesMatch "\.(html|htm|txt)$"> Header set Cache-Control "max-age=600" </FilesMatch> # Do not cache <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$"> Header unset Cache-Control </FilesMatch> # Compress output <IfModule mod_deflate.c> <FilesMatch "\.(html|js|css)$"> SetOutputFilter DEFLATE </FilesMatch> </IfModule> # Error Documents ErrorDocument 206 /error/206.html ErrorDocument 401 /error/401.html ErrorDocument 403 /error/403.html ErrorDocument 404 /error/404.html ErrorDocument 500 /error/500.html # Prevent hotlinking RewriteCond %{HTTP_REFERER} !^$ RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC] RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L] # Prevent offline browsers RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR] RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR] RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR] RewriteCond %{HTTP_USER_AGENT} ^Custo [OR] RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR] RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR] RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR] RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR] RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR] RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR] RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR] RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR] RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR] RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR] RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR] RewriteCond %{HTTP_USER_AGENT} ^GetWeb! 
[OR] RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR] RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR] RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR] RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR] RewriteCond %{HTTP_USER_AGENT} ^HMView [OR] RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR] RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR] RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR] RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR] RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR] RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR] RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR] RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR] RewriteCond %{HTTP_USER_AGENT} ^larbin [OR] RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR] RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR] RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR] RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR] RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR] RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR] RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR] RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR] RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR] RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR] RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR] RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR] RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR] RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR] RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR] RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR] RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR] RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR] RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR] RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR] RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR] RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR] RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR] RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR] RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR] RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR] RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR] RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR] RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR] RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR] RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR] RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR] RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR] RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR] RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR] RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR] RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR] RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR] RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR] RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR] RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR] RewriteCond %{HTTP_USER_AGENT} ^Wget [OR] RewriteCond %{HTTP_USER_AGENT} ^Widow [OR] RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR] RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR] RewriteCond %{HTTP_USER_AGENT} ^Zeus RewriteRule ^.*$ http://www.google.com [R,L] # Protect against DOS attacks by limiting file upload size LimitRequestBody 10240000 # Deny access to sensitive files <FilesMatch "\.(htaccess|psd|log)$"> Order Allow,Deny Deny from all </FilesMatch>

    Read the article

  • Picture syncing on Multiple Macs, iPhones, and iPad together so each device can update them all

    - by cohortq
    Hello! One of the owners of my company has tasked me with syncing his pictures between the following devices: (2) iPhones, (2) iMacs, (1) MacBook Air, (1) iPad. Here is what is happening: 1) He has a camera that can upload pictures into iPhoto on either of his iMacs or the MacBook Air. 2) He has (2) different iPhones, and here is how they are paired up: iPhone - iMac, Home iPhone - MacBook Air. 3) He has MobileMe syncing Calendar, Contacts, and Notes across all devices. 4) Currently we are using MobileMe web galleries to sync all photos, by having me create each album and upload it to the MobileMe web gallery. Now the problem is: he just wants to take pictures and have them sync to all his devices once he does; he'll even dock the iPad. Is there a better way to sync photos between all the devices?

    Read the article

  • VisualSVN Server running but cannot access/browse repositories

    - by user1783560
    Operating system: Windows Web Server 2008 R2. VisualSVN Server version: 2.5.7, Subversion: 1.7.7, Apache: 2.2.22. I freshly installed the latest VisualSVN Server version on the server and created one repository in it. The server management window shows that the server is up and running, but when I try to browse it in a web browser it doesn't respond. I am not able to import my existing code into the repository (Error: Cannot connect to server) or to open/browse the repository with either http://localhost:81/svn, http://www.myserver.com:81/svn, or http://myIPAddress:81/svn. The VisualSVN log is clean; the last information in the server log is that "The server is listening to port 81."

    Read the article

  • Admin mode on Procurve Switches

    - by stefan.at.wpf
    Not being a network expert, I spent some time configuring my network until I found my mistake: on my HP ProCurve 1810G switch, I thought that "Admin mode" controlled whether the administrative interface can be accessed from that port. Well, it actually means whether the port is enabled or not. Extract from the help function: "Admin Mode - Select to enable the port-control administration state. Click to enable and have the port participate in the network. (Default: Enabled)" Of course I didn't read the help, because I didn't doubt it was for the administrative interface and suspected an error somewhere else. Anyway, I am wondering whether that is a commonly used term for enabling/disabling ports, or whether HP just wanted to make my life harder. I can't understand why this option isn't just called "Enable port"! Here's a screenshot of how it looks in the web interface (yeah, shame on me for using a web interface).

    Read the article

  • Access Other Computer via ADSL Modem and Router

    - by Sohail
    My network configuration is like this: ISP - Modem - Router - Computer, and the same goes for my friend. We want to share large files or access each other's computer. The passwords to log in to each other's modem web pages are known; I mean, we can access each other's modem web GUI easily (because we know the WAN IP of each other's modem). The question is: how do I access his computer, or how does he access mine? Is there any way to do so? Remote Desktop Connection is not working :( Please help.

    Read the article

  • Security issues of running PHP scripts as the owner of the PHP file with suexec

    - by thomasrutter
    I'm using suexec to ensure that PHP scripts (and other CGI/FastCGI apps) are run as the account holder associated with the relevant virtual host. This secures each user's scripts from being read or written by other users. However, it occurs to me that this opens up a different security hole. Previously, the web server ran as an unprivileged user with read-only access to users' files (unless a user changed the file permissions for some reason). Now, the scripts served by the web server can also write to their owner's files. So while I've prevented different users from taking advantage of each other's scripts, I've made it so that if some application has a remote code injection vulnerability, it now has not only read access but also write access to all of that user's scripts and website. How can I deal with this? One idea I've had is to create a second user account for each user account in the system, so that each user has their own account and all their scripts run under another account. But that seems cumbersome.

    Read the article

  • Connecting multiple ColdFusion 10 instances to a single Apache 2.2 server

    - by Adam Cameron
    This is on Windows 7 Home Premium edition. I have got two ColdFusion 10 (updater 2) instances: "cfusion" (the default one) and "scratch". I have got a single instance of Apache 2.2 running. Within Apache, I have set up two virtual hosts, each of which needs to be served by a different ColdFusion instance. Each of the CF instances serves files fine via Tomcat's internal web server, and Apache serves vanilla HTML files fine too. So both CF instances and both virtual hosts separately work OK. I can get wsconfig.exe to connect either one of the CF instances to the Apache server, and serve CF files via Apache and that instance. However, I cannot find a way of connecting the second CF instance to Apache as well, so that both CF instances are connected, each serving one of the virtual hosts. WSConfig doesn't seem to understand the notion of "multiple CF instances", and the changes it makes to httpd.conf (via mod_jk.conf) do not seem to be implemented in such a way as to accommodate multiple CF instances talking to a single Apache instance, or multiple virtual hosts. I freely admit to not being confident enough with how mod_jk (or even really httpd.conf) works to be able to guess whether I can change stuff to make it work. If I try to add the second CF instance using WSConfig, I just get a message "the web server is already configured for ColdFusion". Be that as it may... not the instance of ColdFusion I want to connect it to! If I remove the existing connector to whichever instance is already connected, I can then connect the other one with no problems. Not that this helps, but it demonstrates that either CF instance can connect to Apache. This all used to be fairly straightforward under older versions of CF and JRun :-( The only docs I have found are on the "Connect multiple Apache virtual hosts on a web server to a single ColdFusion server" page, but that specifically only deals with a single CF instance. There is no equivalent page for multiple CF instances. I'm kinda hoping I can move some of the mod_jk config into my virtual host entries in httpd-vhosts.conf (this is how it used to work for JRun), but I've no idea what to put where. I think I've covered all the necessary info here? If not, sing out and I'll add more. Thanks. PS: I tried to specifically tag this as "ColdFusion-10" as the answer will be different from previous CF versions, but it won't let me because my rep on this site is too low (odd how it doesn't consider my rep from other S/O sites...). If someone with sufficient rep can add it, that'd be cool: it's probably a valid tag to have. Ta.
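
    In case it helps anyone fighting the same thing, a hand-maintained mod_jk setup for two instances would roughly take this shape: one AJP worker per ColdFusion instance in workers.properties, then a JkMount inside each virtual host. The AJP ports, host names, document roots and file paths below are assumptions to verify against each instance's own connector configuration, not values taken from the question.

```apache
# --- conf/workers.properties (ports are assumptions; check each instance) ---
# worker.list=cfusion,scratch
# worker.cfusion.type=ajp13
# worker.cfusion.host=localhost
# worker.cfusion.port=8012
# worker.scratch.type=ajp13
# worker.scratch.host=localhost
# worker.scratch.port=8013

# --- httpd.conf / mod_jk.conf ---
# Adjust the module path to wherever the connector module actually lives.
LoadModule jk_module modules/mod_jk.so
JkWorkersFile conf/workers.properties
JkLogFile logs/mod_jk.log

<VirtualHost *:80>
    ServerName site-one.local
    DocumentRoot "C:/websites/site-one"
    JkMount /*.cfm   cfusion
    JkMount /*.cfc   cfusion
    JkMount /CFIDE/* cfusion
</VirtualHost>

<VirtualHost *:80>
    ServerName site-two.local
    DocumentRoot "C:/websites/site-two"
    JkMount /*.cfm   scratch
    JkMount /*.cfc   scratch
    JkMount /CFIDE/* scratch
</VirtualHost>
```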

    Read the article

  • Can I use Zoneedit to do URL rewrite?

    - by chilly-child
    This is our scenario: our DNS is hosted by a company, but they don't manage it. We use Zoneedit (www.zoneedit.com) to manage the DNS records such as nameservers, CNAMEs, etc. Then we have our web host, where we just have our files hosted. We have a subdomain created on Zoneedit. We would like to do a URL rewrite so that subdomain.ourdomain.com is displayed as www.ourdomain.com/subdomain. Do I use Zoneedit, the web host, or the DNS host to do the URL rewrite? I checked the Zoneedit docs but could not find a way to do a URL rewrite. I need some advice. Thanks.

    Read the article

  • MySql service stops under 2008 r2 x64

    - by volody
    I have installed MySQL 5.5 server under Windows 2008 R2 x64. Apparently the MySQL service stops even though it is configured to start automatically. What can I do to find out why this is happening? The MySQL database is used as the backend of an ASP.NET web site. Is it possible that the web site was not active for a while and the system stopped the MySQL service? Update: it was mysql-5.5.7-rc-winx64; it could be an issue with this version (a release candidate). Now I am trying to install mysql-5.5.8-winx64, and I have an issue with configuring MySQL to work using named pipes: I unchecked use of the TCP/IP protocol and the configuration wizard just hangs. Update: I have found a workaround. You have to configure MySQL to use TCP/IP first, then reconfigure it to use named pipes. It looks like this link also has some information about the possible problems: How should I diagnose ERROR 1045 during MySQL installation?

    Read the article

  • What ports do I need open for IMAP connections

    - by iamjonesy
    I'm developing a web application that connects to an IMAP mailbox and fetches emails as part of its functionality. The application is PHP and I'm connecting like this: public function connect() { /* connect to gmail */ $hostname = '{imap.gmail.com:993/imap/ssl}INBOX'; $username = $this->username; $password = $this->password; /* try to connect */ $this->inbox = imap_open($hostname, $username, $password) or die('Cannot connect to Gmail: ' . imap_last_error()); } Developing locally on my Mac this was fine; I was able to connect and get emails. However, now that I've put the app on my web host's server I'm getting the following error: Cannot connect to Gmail: Can't connect to gmail-imap.l.google.com,993: Connection timed out. After checking with my hosting provider, they told me outgoing connections on port 993 are blocked. Is there any way around this? Otherwise I need to upgrade to a dedicated server :S
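
    Before committing to a dedicated server, it may be worth confirming from the hosting box exactly what is blocked; a small sketch that probes port 993 (what the imap_open call above needs) alongside port 80 as a control for ordinary outbound traffic:

```php
<?php
// Probe outbound connectivity from the hosting server: 993 for Gmail IMAPS,
// plus 80 as a control to show that ordinary outbound traffic still works.
$targets = array(
    array('ssl://imap.gmail.com', 993),
    array('tcp://www.google.com', 80),
);
foreach ($targets as $t) {
    $fp = @fsockopen($t[0], $t[1], $errno, $errstr, 5);
    printf("%s:%d -> %s\n", $t[0], $t[1], $fp ? 'reachable' : "blocked ($errno: $errstr)");
    if ($fp) {
        fclose($fp);
    }
}
```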

    Read the article

  • How to configure Windows 2008 R2 server for LAN and wireless internet connections

    - by Alchemical
    For special testing purposes, we need a Windows server to allow the following: a team member can log in remotely to the server; when remotely logged in, they can disconnect the wireless connection, perform a few tests, and then reconnect the wireless connection. In general, the LAN connection would just be used for the remote login, while the wireless connection would be used for performing tests, including using a web browser to test certain web sites. How can we successfully configure the server to support two network connections like this (a regular LAN connection plus a wireless connection), and also make sure that the tests we perform using the browser use the wireless connection for the outgoing internet activity?

    Read the article

< Previous Page | 762 763 764 765 766 767 768 769 770 771 772 773  | Next Page >