Search Results

Search found 8020 results on 321 pages for 'webcenter sites'.

Page 20/321 | < Previous Page | 16 17 18 19 20 21 22 23 24 25 26 27  | Next Page >

  • Is this Anti-Scraping technique viable with Crawl-Delay?

    - by skibulk
    I want to prevent web scrapers from abusing the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" error code to users who access an abnormal number of pages per minute. I don't want search engine spiders to ever receive the error. My inclination is to set a robots.txt crawl-delay that keeps spiders' request rate below my 503 threshold. Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?
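
    For reference, the crawl-delay approach is a one-line robots.txt rule; a minimal sketch is below (the 10-second value is only an example). Note that Bing and Yandex honor Crawl-delay but Googlebot ignores it, so the 503 threshold still needs to sit above Googlebot's crawl rate (which is adjustable in Webmaster Tools).

      # robots.txt sketch - ask compliant crawlers to wait 10 seconds between requests
      User-agent: *
      Crawl-delay: 10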

    Read the article

  • Help with google pages; using them in general

    - by Sugarcube
    Is there any way someone could help me with building a website using Google Pages? I cannot for the life of me figure out how to do much more than create a new page, and I have been up all night. I thought I knew computers and websites, but apparently I need a refresher course. Please, if you can help, I would be most appreciative. -Sugarcube
    Edit: Seems this is too vague. But honestly, I don't even know where to begin with the way Google has it set up. I tried to set one up with a template and a layout, but the layout has pages that I did not create and that I don't know how to edit to make them suitable for my website. I guess if I had to start somewhere, it would be with how to make it do what I want it to do.

    Read the article

  • Seminar announcement (Japanese): Oracle / SCSK web experience seminar [original title garbled]

    - by Shinji Watanabe
    [Japanese seminar announcement; most of the original text is garbled. Recoverable details: a half-day seminar co-presented with SCSK Corporation on web customer experience and Web CMS, held 2012-10-10, 14:00-17:15 (reception from 13:30), capacity about 50. Agenda: 14:00-14:10 opening remarks (Oracle Fusion Middleware business); 14:10-15:00 keynote session on User Experience by a company CEO; 15:00-15:45 session on B2C/B2B e-commerce (CRM/HCM applications group); 15:45-16:00 break; 16:00-16:45 SCSK customer case-study session featuring the PfizerPRO website; 16:45-17:15 SCSK session on corporate web site renewal and CMS.]

    Read the article

  • Web hosting for multiple web sites providing system isolation

    - by Justin
    We have a small number of projects where we expect the client will not be maintaining the installed versions of the applications we install to power the site (such as Drupal). Given that an important part of security is keeping things updated, we don't want to host these projects on our Plesk-powered dedicated servers that currently host many of our other clients' websites. Our goal is to find a host where we can deploy isolated instances (be these slices, virtual servers, grid servers, etc.) for each individual web site (or group of 2-3 sites) as we need them. These instances would be completely separate, so that if one web site were hacked it would not impact any other site. Typical hosting requirements:
    - Linux, Apache, PHP 5, MySQL
    - Supports Drupal
    - Ability to set up a cron task (but we don't need SSH access)
    - Daily backups
    - Virtualized/cloud hosting (we want to avoid shared)
    - Pricing per site around $25/month
    - OS is patched automatically
    Some options we have considered but won't work:
    - MediaTemple: Two major data-center-wide security incidents and recent downtime foster doubt about this host's technical ability.
    - Slicehost: This would require us to manage the entire server, which we don't want to do.
    - Rackspace Cloud Sites (formerly Mosso): No backup options.
    Do you have any recommended hosting options given these requirements?

    Read the article

  • "Server Unavailable" and removed permissions on .NET sites after Windows Update [closed]

    - by andrewcameron
    Our company has five almost identical Windows 2003 servers with the same host, and all but one performed an automatic Windows Update last night without issue. The one that had problems, of course, was the one that hosts the majority of our sites. What the update appeared to do was cause the NETWORK user to stop having access to the .NET Framework 2.0 files, as the event log was complaining about not being able to open System.Web. This resulted in every .NET site on the server returning "Server Unavailable" as the App Domains failed to initialise. I ran aspnet_regiis, which didn't appear to fix the problem, so I ran FileMon, which revealed that nobody but the Administrators group had access to any files in any of the website folders! After resetting the permissions, things appear to be fine. I was wondering if anyone had an idea of what could have caused this to go wrong? As I say, the four other servers updated without a problem. Are there any known issues involved with any of the following updates? My main suspect at the moment is the 3.5 update, as all of the sites on the server are running in 3.5.
    - Windows Server 2003 Update Rollup for ActiveX Killbits for Windows Server 2003 (KB960715)
    - Windows Server 2003 Security Update for Internet Explorer 7 for Windows Server 2003 (KB960714)
    - Windows Server 2003 Microsoft .NET Framework 3.5 Family Update (KB959209) x86
    - Windows Server 2003 Security Update for Windows Server 2003 (KB958687)
    Thanks for any light you can shed on this.

    Read the article

  • added IP-based virtual host to sites-available and created symlink to sites-enabled, but new domain times out

    - by lililili
    I added an IP-based virtual host to sites-available and created a symlink to sites-enabled, but the new domain times out. When I navigate to mynewdomain.com it says the connection timed out.
    NameVirtualHost 12.12.12.12
    <VirtualHost 12.12.12.12>
        ServerAdmin webmaster@localhost
        ServerName newdomain.com
        DocumentRoot /var/www/newdomain.com
        <Directory />
            Options FollowSymLinks
            AllowOverride None
        </Directory>
        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>
        ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
        <Directory "/usr/lib/cgi-bin">
            AllowOverride None
            Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
            Order allow,deny
            Allow from all
        </Directory>
        ErrorLog /var/log/apache2/error.log
        # Possible values include: debug, info, notice, warn, error, crit,
        # alert, emerg.
        LogLevel warn
        CustomLog /var/log/apache2/access.log combined
        ServerSignature On
        Alias /doc/ "/usr/share/doc/"
        <Directory "/usr/share/doc/">
            Options Indexes MultiViews FollowSymLinks
            AllowOverride None
            Order deny,allow
            Deny from all
            Allow from 127.0.0.0/255.0.0.0 ::1/128
        </Directory>
    </VirtualHost>
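
    A few sanity checks that often narrow this down (a sketch assuming a Debian/Ubuntu Apache 2.2 layout; the file and domain names mirror the config above):

      # is the vhost enabled and does Apache actually parse it?
      sudo a2ensite newdomain.com
      sudo apache2ctl -S                 # lists the NameVirtualHost/VirtualHost mappings Apache sees
      # is Apache listening on the address/port used by the vhost?
      grep Listen /etc/apache2/ports.conf
      sudo netstat -tlnp | grep apache
      # does the new domain's DNS really point at 12.12.12.12?
      dig +short mynewdomain.com
      # apply changes
      sudo service apache2 reload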

    Read the article

  • "Server Unavailable" and removed permissions on .NET sites after Windows Update

    - by tags2k
    Our company has five almost identical Windows 2003 servers with the same host, and all but one performed an automatic Windows Update last night without issue. The one that had problems, of course, was the one which hosts the majority of our sites. What the update appeared to do was cause the NETWORK user to stop having access to the .NET Framework 2.0 files, as the event log was complaining about not being able to open System.Web. This resulted in every .NET site on the server returning "Server Unavailable" as the App Domains failed to be initialise. I ran aspnet_regiis which didn't appear to fix the problem, so I ran FileMon which revealed that nobody but the Administrators group had access to any files in any of the website folders! After resetting the permissions, things appear to be fine. I was wondering if anyone had an idea of what could have caused this to go wrong? As I say, the four other servers updated without a problem. Are there any known issues involved with any of the following updates? My major suspect at the moment is the 3.5 update as all of the sites on the server are running in 3.5. Windows Server 2003 Update Rollup for ActiveX Killbits for Windows Server 2003 (KB960715) Windows Server 2003 Security Update for Internet Explorer 7 for Windows Server 2003 (KB960714) Windows Server 2003 Microsoft .NET Framework 3.5 Family Update (KB959209) x86 Windows Server 2003 Security Update for Windows Server 2003 (KB958687) Thanks for any light you can shed on this.

    Read the article

  • wget hangs at "HTTP request sent, awaiting response" on some sites

    - by gkr
    Using Ubuntu 12.04. wget hangs at "HTTP request sent, awaiting response..." on some sites. Browsers also fail to open the sites that fail in wget, but on WinXP everything works.
    This works:
    gkr@gkr-desktop:~/Documents/curl$ wget google.com
    --2012-06-12 21:29:37--  http://google.com/
    Resolving google.com (google.com)... 74.125.236.174, 74.125.236.160, 74.125.236.161, ...
    Connecting to google.com (google.com)|74.125.236.174|:80... connected.
    HTTP request sent, awaiting response... 301 Moved Permanently
    Location: http://www.google.com/ [following]
    --2012-06-12 21:29:38--  http://www.google.com/
    Resolving www.google.com (www.google.com)... 74.125.236.179, 74.125.236.180, 74.125.236.176, ...
    Connecting to www.google.com (www.google.com)|74.125.236.179|:80... connected.
    HTTP request sent, awaiting response... 302 Found
    Location: http://www.google.co.in/ [following]
    --2012-06-12 21:29:38--  http://www.google.co.in/
    Resolving www.google.co.in (www.google.co.in)... 74.125.236.184, 74.125.236.191, 74.125.236.183, ...
    Connecting to www.google.co.in (www.google.co.in)|74.125.236.184|:80... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: unspecified [text/html]
    Saving to: `index.html.3'
    [ ] 13,383      --.-K/s   in 0.04s
    2012-06-12 21:29:39 (308 KB/s) - `index.html.3' saved [13383]
    gkr@gkr-desktop:~/Documents/curl$
    This site just stops/hangs at awaiting response:
    gkr@gkr-desktop:~/Documents/curl$ wget grooveshark.com
    --2012-06-12 21:27:29--  http://grooveshark.com/
    Resolving grooveshark.com (grooveshark.com)... 8.20.213.76
    Connecting to grooveshark.com (grooveshark.com)|8.20.213.76|:80... connected.
    HTTP request sent, awaiting response... ^C
    gkr@gkr-desktop:~/Documents/curl$
    Thanks
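
    A couple of checks that sometimes narrow down a hang at "awaiting response" (a sketch; the wwan0 interface name and the 1400-byte MTU are assumptions for a USB-modem link):

      # see whether curl stalls at the same point, with verbose output
      curl -v http://grooveshark.com/ -o /dev/null
      # test whether large unfragmented packets survive the link
      # (path-MTU trouble often looks like "connected" followed by silence)
      ping -c 3 -M do -s 1460 grooveshark.com
      # if large packets are dropped, try lowering the MTU on the modem interface
      sudo ip link set dev wwan0 mtu 1400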

    Read the article

  • "Server Unavailable" and removed permissions on .NET sites after Windows Update

    - by tags2k
    Our company has five almost identical Windows 2003 servers with the same host, and all but one performed an automatic Windows Update last night without issue. The one that had problems, of course, was the one which hosts the majority of our sites. What the update appeared to do was cause the NETWORK user to stop having access to the .NET Framework 2.0 files, as the event log was complaining about not being able to open System.Web. This resulted in every .NET site on the server returning "Server Unavailable" as the App Domains failed to be initialise. I ran aspnet_regiis which didn't appear to fix the problem, so I ran FileMon which revealed that nobody but the Administrators group had access to any files in any of the website folders! After resetting the permissions, things appear to be fine. I was wondering if anyone had an idea of what could have caused this to go wrong? As I say, the four other servers updated without a problem. Are there any known issues involved with any of the following updates? My major suspect at the moment is the 3.5 update as all of the sites on the server are running in 3.5. Windows Server 2003 Update Rollup for ActiveX Killbits for Windows Server 2003 (KB960715) Windows Server 2003 Security Update for Internet Explorer 7 for Windows Server 2003 (KB960714) Windows Server 2003 Microsoft .NET Framework 3.5 Family Update (KB959209) x86 Windows Server 2003 Security Update for Windows Server 2003 (KB958687) Thanks for any light you can shed on this.

    Read the article

  • Trouble Letting Users Get to Certain Sites through Squid Proxy

    - by armani
    We have Squid running on a RHEL server. We want to block users from getting to Facebook, other than a couple of specific sites, like our organization's page. Unfortunately, I can't get those specific pages unblocked without allowing ALL of Facebook through.
    [squid.conf]
    # Local users:
    acl local_c src 192.168.0.0/16
    # HTTP & HTTPS:
    acl Safe_ports port 80 443
    # File containing blocked sites, including Facebook:
    acl blocked dstdom_regex "/etc/squid/blocked_content"
    # Whitelist:
    acl whitelist url_regex "/etc/squid/whitelist"
    # I do know that order matters:
    http_access allow local_c whitelist
    http_access allow local_c !blocked
    http_access deny all
    [blocked_content]
    .porn_site.com
    .porn_site_2.com
    [...]
    facebook.com
    [whitelist]
    facebook.com/pages/Our-Organization/2828242522
    facebook.com/OurOrganization
    facebook.com/media/set/
    facebook.com/photo.php
    www.facebook.com/OurOrganization
    My biggest weakness is regular expressions, so I'm not 100% sure whether this is all correct. If I remove the "!blocked" part of the http_access rule, all of Facebook works. If I remove "facebook.com" from the blocked_content file, all of Facebook works. Right now, visiting facebook.com/OurOrganization gives a "The website declined to show this webpage / HTTP 403" error in Internet Explorer, and "Error 111 (net::ERR_TUNNEL_CONNECTION_FAILED): Unknown error" in Chrome. WhereGoes.com tells me the redirects for that URL go like this: facebook.com/OurOrganization -- [301 Redirect] -- http://www.facebook.com/OurOrganization -- [302 Redirect] -- https://www.facebook.com/OurOrganization
    I tried turning up the debug traffic out of Squid using "debug_options ALL,6" but I can't narrow anything down in /var/log/access.log and /var/log/cache.log. I know to issue "squid -k reconfigure" whenever I make changes to any of the files.
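
    Two points may be relevant to the tunnel error above. For https:// requests Squid only sees a CONNECT to a hostname, never a URL path, so path-based whitelist entries such as facebook.com/OurOrganization cannot match HTTPS traffic at all. For plain HTTP, here is a hedged sketch of an ACL ordering that allows only the organization pages while blocking the rest of Facebook (the regex and ACL names are illustrative, not tested against a live config):

      # sketch for squid.conf - whitelist specific Facebook paths over HTTP only
      acl local_c src 192.168.0.0/16
      acl fb_domain dstdomain .facebook.com
      acl fb_allowed url_regex -i ^http://(www\.)?facebook\.com/(OurOrganization|pages/Our-Organization/|media/set/|photo\.php)
      http_access allow local_c fb_allowed
      http_access deny local_c fb_domain
      http_access allow local_c
      http_access deny all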

    Read the article

  • All computers on network get stuck waiting for some sites indefinitely

    - by zacaj
    This happens across three computers, running Windows 7 and Ubuntu, with Firefox, Opera, and Chrome (all latest versions). I am connected to the internet through a Verizon wireless USB modem. When I try to open some web pages they never finish loading (and usually never even show anything). The status bar at the bottom of the browser displays "Waiting for X". The servers it gets stuck on include:
    - platform.twitter.com
    - s7.addthis.com
    - connect.facebook.net
    - ajax.googleapis.com
    - 2mdn.net
    I've been getting away with just blocking them in AdBlock up until now; however, the last two have been causing problems. There are some sites that require googleapis.com to load correctly, and some that won't ever load unless it's blocked. eBay requires access to 2mdn.net to load pictures. On top of this, it's getting really annoying having to update AdBlock across all these computers whenever a new site pops up. I'm hoping there's some easier way to fix this? The different sites causing the freeze indicate to me that it's either a problem on my end (somehow?) or some server-side software that got updated with a new bug?

    Read the article

  • Stuck with Documentum Still? Do MORE with Oracle WebCenter!

    - by Michael Snow
    WEBCAST TODAY!! 03/22/12 Do you need to lower costs? Raise productivity? Foster innovation? Improve online engagement? But you're still stuck with Documentum? Step away from the ledge – there is hope – let us help you.
    Top 4 Content Imperatives:
    - Lower Costs – Reduce labor, maintenance fees, storage and electrical consumption
    - Raise Productivity – Automation and integration, communication, findability
    - Foster Innovation – Enable collaboration, expertise location
    - Improve Online Engagement – Enable user-driven, dynamic marketing initiatives
    With the coming technology wave we see four content imperatives. Every organization has had to reduce costs; cost cutting has become a way of life. Everyone is working three jobs as positions are eliminated. And so we have to reduce labor, reduce maintenance, and reduce the money we are wasting on things like storing content that is redundant or no longer useful. We also, to fill that gap, need to raise productivity. Knowledge workers represent the fastest growing segment of the workforce, accounting for 40%-75% of the employees at organizations in sectors like financial services, life sciences, healthcare and retail. What's more, their wages total 18 percent of the United States GDP. And so we can't afford information systems that don't let our top performers be the best they can be. We look to automate the content processes, provide ways to integrate that content into our processes, provide communication to make decisions, and make content more findable so people can make the right decision and move the process forward. And really, to get ourselves out of the current financial situation, we can only cut costs so far. We have to innovate out of economic tough times – to find new products and new markets. And to enable the innovation process, we have to enable collaboration and expertise location. So much of innovation is about building on innovations that have come before. To solve problems, we have to be able to find what our organization has already created. We find that problems we need to solve have already been solved if we can find the right document, the right person. So we have to provide systems that enable us to stand on the shoulders of our organization's accomplishments. Good content drives great marketing. Online engagement is growing as an absolute necessity for modern, growing marketing organizations that require business users to be enabled for dynamic marketing content creation, updates and targeted content creation and management. Unfortunately, if you are currently stuck with Documentum, you are really lacking in your Web Experience Management capabilities. Documentum previously used FatWire for web publishing. Now FatWire is part of Oracle.
    Oracle provides powerful web engagement capabilities:
    - Increase sales and loyalty by optimizing online engagement
    - Create, manage and moderate contextually relevant, targeted and interactive online experiences
    - Optimize customer engagement across web, mobile and social channels
    - Manage large-scale multichannel global online presence with integration to enterprise applications
    - Enable business users to control their content and make their own updates
    - Publish content from native files – enable navigation of project documents, procedures, policy information
    - Enable content display and updates from existing web applications – one click to drag-and-drop content management functionality
    So you get the ability to self-publish information and make it navigable, to move the process of publishing from IT to business users, and the ability to address a whole new area of user engagement with web experience management. So… if you are still stuck with Documentum and don't know what to do – contact us – not only will Oracle help you step away from the ledge, but with the MoveOff Documentum program we are also offering you a way out: trade in your Documentum licenses for a 100% credit on Oracle WebCenter. How's that for a nice bonus? It's time to stop maintaining Documentum, and to start innovating with Oracle WebCenter. Learn More Here! To learn more about what Oracle WebCenter can offer you today, join us for a webcast – your eyes will be opened to all that's possible. Do More with WebCenter: Extend Beyond Content Management

    Read the article

  • django dynamically deduce SITE_ID according to the domain

    - by dcrodjer
    I am trying to develop a site which will render multiple customized sites according to the domain name (subdomain, to be more precise). All of my domain names are redirected to the same server. So for each site there will be a corresponding model which defines how the site should look (SITE - SITE_SETTINGS). What will be the best way to utilize the Django sites framework to get the SITE_ID of the current site from the domain name instead of hard-coding it in the settings files (django sites documentation), and to run database queries and render the views accordingly? If using multiple settings files is my only option, can this (having the WSGI script handle the domain name) be done?
    Update: So finally, following Luke's answer, what I will do is define a custom middleware which makes the views available with the important vars required according to the domain. As far as sitemaps and comments are concerned, I will have to customize the sitemaps app and a custom sites model on which the other models of sites will be based. And since the comments system is based on the hard-coded site ID, I can use it just as is on the models (models will already be filtered according to the site based on my sites framework), though the permalink feature will have to be customized. So, a lot of customization. Please suggest if I am going wrong anywhere in this, because I have to ensure that the features of the project are optimized. Thanks!
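
    For illustration, the per-domain lookup described in the update is typically a few lines of middleware; the sketch below is only an example (the CurrentSiteMiddleware name and the request.site attribute are made up for this sketch, and it uses the old-style Django 1.x middleware API of this era):

      # middleware.py - minimal sketch: resolve the current Site from the request host
      from django.contrib.sites.models import Site
      from django.http import Http404

      class CurrentSiteMiddleware(object):
          """Attach the Site whose domain matches the incoming host to the request."""

          def process_request(self, request):
              host = request.get_host().split(':')[0]  # drop any port number
              try:
                  request.site = Site.objects.get(domain=host)
              except Site.DoesNotExist:
                  raise Http404("No site configured for host %s" % host)
              return None  # continue normal request processing

    Views and custom managers can then filter on request.site rather than the global SITE_ID, which sidesteps hard-coding it in settings; generic apps that insist on settings.SITE_ID (comments, sitemaps) still need the customization described in the update.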

    Read the article

  • Disable cookies for selected sites

    - by acidzombie24
    So far all sites have played nice, but recently I ran into a site which aggressively tries to get users to sign up and pretty much puts the site on lockdown after 10 pageviews without signing up. What Firefox extension can I use to either freeze or disable cookies for this one specific site? I don't want to whitelist every site I visit because one site is giving me trouble.

    Read the article

  • Drupal Sites Backup and Restore to Amazon S3

    - by Ngu Soon Hui
    There are modules written for database backup and for files backup, but what I want is a complete backup to Amazon S3 or another cloud platform, covering both the data and the sites. Currently, as it stands, I have to back the two up separately and manually. Is there any module/tool/already-written script that allows me to do that?
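
    Until an integrated module turns up, the manual two-part backup can at least be scripted and run from cron; a rough sketch assuming drush and s3cmd are installed and configured (the paths and bucket name are placeholders):

      #!/bin/sh
      # backup-drupal.sh - dump the database, archive the files, push both to S3
      STAMP=$(date +%Y%m%d)
      SITE=/var/www/drupal
      drush --root="$SITE" sql-dump --gzip --result-file=/tmp/drupal-db-$STAMP.sql
      tar czf /tmp/drupal-files-$STAMP.tar.gz -C "$SITE" .
      s3cmd put /tmp/drupal-db-$STAMP.sql.gz /tmp/drupal-files-$STAMP.tar.gz s3://my-backup-bucket/drupal/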

    Read the article

  • Python Django sites on Apache+mod_wsgi with nginx proxy: highly fluctuating performance

    - by Halfgaar
    I have an Ubuntu 10.04 box running several dozen Python Django sites using mod_wsgi (embedded mode; the faster mode, if properly configured). Performance fluctuates wildly: sometimes fast, sometimes several seconds of delay. The smokeping graphs are all over the place. Recently, I also added an nginx proxy for the static content, in the hope it would cure the fluctuating performance. But even though it significantly reduced the number of requests Apache has to process, it didn't help with the main problem. When clicking around on websites while running htop, it can be seen that sometimes requests are almost instant, whereas sometimes they cause Apache to consume 100% CPU for a few seconds. I really don't understand where this fluctuation comes from. I have configured the mpm_worker for Apache like this:
    StartServers          1
    MinSpareThreads      50
    MaxSpareThreads      50
    ThreadLimit          64
    ThreadsPerChild      50
    MaxClients           50
    ServerLimit           1
    MaxRequestsPerChild   0
    MaxMemFree         2048
    One server with 50 threads, max 50 clients. Munin and apache2ctl -t both show a consistent presence of workers; they are not destroyed and created all the time, yet it behaves as if they were. This tells me that once a sub-interpreter is created, it should remain in memory, yet it seems sites have to reload all the time. I also have an nginx+gunicorn box, which performs quite well. I would really like to know why Apache is so random. This is a virtual host config:
    <VirtualHost *:81>
        ServerAdmin [email protected]
        ServerName example.com
        DocumentRoot /srv/http/site/bla
        Alias /static/ /srv/http/site/static
        Alias /media/ /srv/http/site/media
        WSGIScriptAlias / /srv/http/site/passenger_wsgi.py
        <Directory />
            AllowOverride None
        </Directory>
        <Directory /srv/http/site>
            Options -Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>
    </VirtualHost>
    Ubuntu 10.04, Apache 2.2.14, mod_wsgi 2.8, nginx 0.7.65
    Edit: I've put some code in the settings.py file of a site that writes the date to a tmp file whenever it's loaded. I can now see that the site is not randomly reloaded all the time, so Apache must be keeping it in memory. So that's good, except it doesn't bring me closer to an answer...
    Edit: I just found an error that might also be related to this:
    File "/usr/lib/python2.6/subprocess.py", line 633, in __init__
        errread, errwrite)
    File "/usr/lib/python2.6/subprocess.py", line 1049, in _execute_child
        self.pid = os.fork()
    OSError: [Errno 12] Cannot allocate memory
    The server has 600 of 2000 MB free, which should be plenty. Is there a limit set on Apache or WSGI somewhere?
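
    One configuration change often tried for exactly this symptom is moving from embedded mode to mod_wsgi daemon mode, so each site gets a fixed pool of processes whose Python interpreters stay resident regardless of what the Apache workers are doing, and whose memory is bounded per site. A sketch of the extra vhost directives (the process/thread counts are placeholders, not tuned values):

      WSGIDaemonProcess examplecom processes=2 threads=15 display-name=%{GROUP}
      WSGIProcessGroup examplecom
      WSGIApplicationGroup %{GLOBAL}
      WSGIScriptAlias / /srv/http/site/passenger_wsgi.py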

    Read the article

  • Sharepoint 2010 My-sites Active Directory connection error

    - by Tuomas Hietanen
    There is a good tutorial on creating SharePoint 2010 My Sites here. In my environment I can't get the connection to Active Directory to work; it fails with "An operation error occurred." There is a note that says "For Active Directory connections to work, this account must have directory sync rights," but I don't know what that means... My question is: there is nothing in the Event Log, so where is the error log? :-)

    Read the article

  • Backup solutions for Rackspace Cloud Sites

    - by Daniel A. White
    What options do I have for backing up the content from Rackspace Cloud Sites, including files and databases? I know they have cron jobs, but I am not sure what options I have when it comes to that. Here is some information on the cron jobs they support: http://cloudsites.rackspacecloud.com/index.php/How%5Fdo%5FI%5Fschedule%5Fa%5Fcron%5Fjob%3F

    Read the article

  • How to stop videos on news sites from overriding my speaker mute

    - by Curious
    Just recently I have found that when I click on news stories, their advertisement and news videos start up automatically, and if I have set the speakers to mute, the page overrides this. I then re-mute the speakers and a few seconds later the mute is overridden and the sound starts up again. I think this will become an increasingly common problem, as it seems to be a "new trick" used by hungry media sites. Can someone please let me know how to stop it happening?

    Read the article

  • Rackspace Cloud Sites API (Not Cloud Servers)

    - by Jeff
    I'm looking for a way to pull data from my Rackspace Cloud SITES account. The data I want to pull is bandwidth, diskspace, and compute cycles (all available from control panel). I'd like to set up my own warning system, to be notified if I'm close to my limits on any given month. Does anyone know of a way/API to do this?

    Read the article

  • Easy way to deploy PHP sites from git

    - by Leopd
    I'm looking for recommendations on how to automate / simplify deployment from a git repository (github) to a hosting service. The hosting service supports FTP (yuck) / SSH / SFTP access. Any good tools out there to give push-button deployment of new revisions? I know it's not a hard script to write, but when you start thinking about things like roll-back and multiple sites, it gets complicated enough that I'd rather not re-invent the wheel.
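
    For the roll-your-own route, one common minimal pattern is to export the tree with git archive into a timestamped release directory over SSH and flip a symlink, which keeps rollback as simple as re-pointing the link; a rough sketch (the host, paths and release layout are placeholders):

      #!/bin/sh
      # deploy.sh - ship the current HEAD to the server as a new release
      HOST=user@example-host
      APP=/var/www/example-site
      REL=$APP/releases/$(date +%Y%m%d%H%M%S)
      ssh "$HOST" "mkdir -p $REL"
      git archive --format=tar HEAD | ssh "$HOST" "tar -x -C $REL"
      ssh "$HOST" "ln -sfn $REL $APP/current"   # point the web root at .../current
      # rollback: ssh in and re-point the symlink at the previous release directory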

    Read the article
