Search Results

Search found 24735 results on 990 pages for 'site ranking'.

Page 676/990

  • Reverse proxy with SSL and IP passthrough?

    - by Paul
    It turns out that the IP of a much-needed new website is blocked from inside our organization's network, for reasons that will take weeks to fix. In the meantime, could we set up a reverse proxy on an Internet-based server that forwards the SSL traffic, and ideally the client IPs, to the external site? Load will be light. There is no need to terminate SSL on the proxy. We may be able to poison our internal DNS so that the original URL still works. How do I work out whether I need URL rewriting? Squid, Apache, nginx, or something else? Setup would be fastest on Win 2000, but other OSes are OK if that would help. Simple and quick are good, since it's a temporary solution. Thanks for your thoughts!
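    Since no SSL termination is needed, one option is a plain TCP pass-through in front of the external site. The sketch below is only illustrative: it assumes nginx built with the stream module and uses a placeholder upstream address, and it simply relays the TLS connection without decrypting it. Preserving client IPs would additionally require the destination to understand the PROXY protocol, which is an assumption to verify.

      stream {
          server {
              listen 443;
              # placeholder address of the external site; raw TLS is relayed, not terminated
              proxy_pass 203.0.113.10:443;
              # proxy_protocol on;   # only if the destination can parse the PROXY protocol header
          }
      }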

    Read the article

  • SharePoint 2010 MySites - Host on separate servers

    - by Chris W
    We're playing with the SP 2010 Beta ahead of a planned deployment later this year in an academic environment. We anticipate that the majority of traffic will go through MySites once everything is provisioned, so we're looking at how to plan our SP topology to scale nicely. An initial thought is to run the main portal on one server, host "Student" MySites on a second server and "Staff" MySites on a third. Is it actually possible to do this easily, or are we going down a bad path? Specifically, can we have two different MySites site collections, each hosted on a dedicated server? If so, can we configure SharePoint to work out from the user's logon account which type of user they are and route them to the correct server?

    Read the article

  • Working with Google Webmaster Tools

    - by com
    My first question is about Crawl errors in Google Webmaster Tools. Crawl errors are divided into a few sections, one of which is HTTP. I assume that all the broken links under HTTP were found by the crawler somehow and are not the links from the sitemap. If they were found by scanning the sitemap pages for links, why doesn't it mention the source page, the way the Sitemap section does with its Linked From column? And what does Linked From mean in that section? If the section is called Sitemap, I would expect all of its URLs to come from the sitemap, so why is there a Linked From at all? The second question: what is the best way to treat search on the site? Why are the search result pages getting indexed? Because all the search result pages are indexed, I have too many pages under Linked From. What is the right practice? Question three: to improve the response time reported in WMT, can I redirect all crawler requests to a designated, free web server? Is this good practice? Question four: how should I treat the Google Analytics code (with the PageView and PageLoadTime parameters) when a user requests a non-existent page: should I render the Google code or not? Right now I put the Google Analytics code in the common page template, so every page, including non-existent pages that show an error message, contains it, and it seems to have an influence on WMT.
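    On the second question, the usual way to keep a site's internal search result pages out of the index is to disallow them for crawlers (or serve them with a noindex robots meta tag). A minimal robots.txt sketch, assuming the result pages live under a hypothetical /search path:

      User-agent: *
      Disallow: /search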

    Read the article

  • Enable mod_deflate per directory level

    - by z1_jabbar
    I am using the following configuration. When I access the site it compresses all the JSPs under every URL path below /abc, but it ignores all the JS and CSS files. How can I get the JS and CSS files in all the subfolders of the /abc path compressed as well? Thanks!

      <LocationMatch "/abc">
        <IfModule mod_deflate.c>
          SetOutputFilter DEFLATE
          # Don't compress images
          SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
          # Don't compress PDFs
          SetEnvIfNoCase Request_URI \.pdf$ no-gzip dont-vary
          # Don't compress already-compressed file formats
          SetEnvIfNoCase Request_URI \.(?:7z|bz|bzip|gz|gzip|ngzip|rar|tgz|zip)$ no-gzip dont-vary
          <IfModule mod_headers.c>
            Header append Vary User-Agent
          </IfModule>
        </IfModule>
      </LocationMatch>
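    One thing the asker could try, offered as a sketch rather than a confirmed fix, is to compress by MIME type instead of relying on the blanket SetOutputFilter, so static .js and .css responses are covered explicitly. This assumes mod_deflate (and, on Apache 2.2+, mod_filter) is loaded:

      <LocationMatch "/abc">
        # compress text-based responses by content type, including stylesheets and scripts
        AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/javascript application/x-javascript
      </LocationMatch>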

    Read the article

  • How do I configure ubuntu server's iptables to allow java without opening the floodgates?

    - by rofls
    I'm new to servers, so please bear with me. I have my amateur site running. The problem is, I followed Rackspace's instructions on setting up iptables and am pretty sure that's why the Java server I'm trying to use on port 8080 isn't working (the script runs, but my Android test app doesn't connect to it). When I try running the same Java server script on port 80, it doesn't even start. I also ran nmap against my domain and saw that indeed only ports 80 and 22 (for SSH) are responding. Is it possible to run Java and Apache happily on the same server? If so, how can I configure my iptables correctly? (I'm aware that I should probably do some sort of filtering in the Java server itself, but I will figure that out later.)
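    Binding to a port below 1024 normally requires root privileges on Linux, which may be why the same script will not even start on port 80. For port 8080, a minimal sketch of the missing firewall rule, assuming the Rackspace-style ruleset filters the INPUT chain and the new rule needs to sit above the final REJECT/DROP entry:

      # accept new TCP connections on 8080 (run as root)
      iptables -I INPUT -p tcp --dport 8080 -m state --state NEW -j ACCEPT
      # persist the change, e.g. into the file restored at boot
      iptables-save > /etc/iptables.rules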

    Read the article

  • block access to certain website types

    - by frustrated teacher
    I need to block access to certain types of website without listing each URL to block. Students at a secondary school are going to porn sites, and I need to block all such access without having to list every possible site URL. Setting the Content > Ratings tab to None for all categories in the ratings files listed on my computers does not prevent access. Unchecking "Users may access sites with no rating," even with the security settings set to High, still allows the porn sites to come up. If that option is checked, then ONLY listed sites can open, and students would not be able to do any research via Google, for example. I would rather not have to keep checking each computer and blocking sites as the students find them.
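    Per-machine ratings settings are hard to keep consistent. A common alternative, offered only as a sketch since nothing in the question says a proxy is available, is to filter at the network edge with a proxy such as Squid and a maintained category blocklist, so blocked domains are listed once instead of on every computer:

      # /etc/squid/squid.conf (excerpt)
      acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"
      http_access deny blocked_sites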

    Read the article

  • Advertising on personalized pages behind a login

    - by johneth
    I am currently building a web app which requires a user to log in. After they log in, they can see the content they've added to the web app, and the things the web app has done with that content. The URL structure won't differentiate between users (e.g. every user's 'homepage' would be example.com/home, not something like example.com/username/home). This is much the same way that Facebook works (all FB users' messages are at facebook.com/messages, for example). This presents a problem with advertising. I know that you can use AdSense behind a login, but as far as I'm aware, that's for things like forums, where everyone sees the same things (which wouldn't be the case on this site). I also know that I could put AdSense on the pages without allowing it to log in, which would produce inferior ads. I'm fairly certain it would be against the Terms of Service to give AdSense a login to a 'dummy' account with typical content, as it would not be seeing the same thing as every other user (which is impossible, as they all see different things). So, my question is: is there an ad network, or other method, that can serve ads behind a login, maybe based on keywords rather than content?

    Read the article

  • Default documentroot apache does not work

    - by James Wise
    I have Apache 2.2 and PHP 5.3.15 on a single server. I configured virtual hosting with a default vhost:

      0_default_.conf     - goes to /var/www/default
      sub.domain.com.conf - goes to /var/www/sub.domain.com

    My question is: how can I make sub.domain.com the default DocumentRoot permanently, so that all requests end up at the sub.domain.com site? I tried removing 0_default_.conf, but then viewing the page displays the PHP source code of sub.domain.com. Here are my configurations -- http://pastebin.com/4e3awUJ4 . I could create an index.php in /var/www/default that permanently redirects to the sub.domain.com site, but that is not a viable solution for me: if the IP address of sub.domain.com is not pointed at this server, users cannot reach that subdomain at all. I would appreciate it if anyone could share their knowledge and wisdom. Thanks. JamesW
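    One way to get that behaviour, sketched here on the assumption that the goal is to serve the sub.domain.com content for any unmatched Host header rather than issue an HTTP redirect, is to keep a default vhost but point it at the same DocumentRoot; Apache hands requests that match no ServerName to the first vhost it loads:

      # 0_default_.conf -- catch-all vhost, loaded first
      <VirtualHost *:80>
          ServerName default.example.invalid
          DocumentRoot /var/www/sub.domain.com
          # reuse the same PHP handler and <Directory> settings as sub.domain.com.conf,
          # otherwise the .php files may be served as source
      </VirtualHost>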

    Read the article

  • Wordpress serving PHP but not CSS or JS

    - by Jason
    I'm trying to set up an Amazon EC2 instance to run a Django app and a WordPress instance side by side, differing only by the incoming URL. Initially, accessing the site via mysite.com/wordpress worked, but I also needed to catch incoming requests to the subdomain blog.mysite.com. To do that, I created a default file in /etc/apache2/sites-enabled and included two virtualhost directives, one of which was:

      <VirtualHost *:80>
          ServerName www.blog.mysite.com
          <Directory /var/www/wordpress>
              Order deny,allow
              Allow from all
          </Directory>
      </VirtualHost>

    This created some errors with the other virtualhost, so I restored the default 000-default file configuration and restarted. Now, accessing mysite.com/wordpress takes forever, and even then the CSS and JS files are not loading. Inside the Firebug Net tab, I can see the HTML response, but the CSS and JS files are not loading at all. What happened here?
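    A sketch of how the blog vhost could be declared so it coexists with the Django app, assuming Apache 2.2-style name-based virtual hosting with the Django app staying in the default vhost; note that NameVirtualHost *:80 must be declared exactly once for name matching to happen at all:

      NameVirtualHost *:80

      <VirtualHost *:80>
          ServerName blog.mysite.com
          ServerAlias www.blog.mysite.com
          DocumentRoot /var/www/wordpress
          <Directory /var/www/wordpress>
              Order deny,allow
              Allow from all
          </Directory>
      </VirtualHost>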

    Read the article

  • Danger in running a proxy server? [closed]

    - by NessDan
    I currently have a home server that I'm using to learn more and more about servers. There's also the advantage of being able to run things like a Minecraft server (yeah!). I recently installed and set up a proxy service known as Squid. The main reason was so that no matter where I was, I would be able to access sites without dealing with any network content filter (like at schools). I wanted to make this public, but I had second thoughts about it. It occurred to me last night that if people were using my proxy, couldn't they access illegal material with it? What if someone used my proxy to download copyrighted material? Or launched an attack on another site via my proxy? What if someone actually looked up child pornography through the proxy? My question is: am I liable for what people use my proxy for? If someone does something illegal and it leads back to my proxy server, could I be held accountable for their actions?

    Read the article

  • How can I use the proxy settings on Epic privacy browser to log on to Facebook?

    - by EddieN120
    I love the Epic privacy browser because it is built from the ground up to enhance privacy. It's built on Chromium, but because it has stripped out all the code that tracks users across the Internet, pages load faster and everything feels snappier. With one click you can enable a proxy to hide your IP address, sort of like Chrome's "Incognito" mode on steroids. But there's a problem: if I load Epic, go to facebook.com, log in, and then click the proxy button, I can use Facebook for a while. Eventually, though, Facebook throws up an error screen saying that it thinks my account has been hacked, and then it makes me verify my identity, forces me to change my password, etc. I've had to change my password four times in as many days, which is very annoying. For now I turn the proxy on for every site except Facebook. Question: how can I use the proxy settings in the Epic privacy browser to successfully log on to and use Facebook?

    Read the article

  • IIS + PHP + Page with lots of images = Intermittent 403 errors

    - by samJL
    I am using an up-to-date Server 2008 R2 Datacenter, running IIS 7.5 and PHP 5.3.6 via FastCGI. On PHP pages with lots of images (60+), some of the images fail to load. It is not always the same images -- on each page refresh an image that worked previously may not load, while an image that did not now does. Looking at the Net tab in Firebug reveals that the failing image requests are 403 errors. All of the images are located on the server in question, and the images directory has the correct permissions. I believe this problem is the result of a limit on requests. All of my attempts at researching this problem point to the maxConnections setting in IIS, yet mine is set at the highest/default of 4294967295 (maxBandwidth too). I am also running a ColdFusion site on the same IIS installation, and it does not suffer from 403s on pages with lots of images. I am left thinking that there is another connection limit (in PHP or FastCGI?) overriding the IIS connection limit. I don't see anything that looks like a request limit in php.ini. What am I missing? Any help would be appreciated, thank you.
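    One place to look for a non-IIS limit is the FastCGI process pool itself; its settings (maxInstances, instanceMaxRequests, queueLength) are separate from the site's maxConnections. A quick way to dump them from an elevated command prompt, offered as a diagnostic sketch rather than a diagnosis:

      %windir%\system32\inetsrv\appcmd list config -section:system.webServer/fastCgi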

    Read the article

  • At the Java DEMOgrounds - ZeroTurnaround and its LiveRebel 2.5

    - by Janice J. Heiss
    At the ZeroTurnaround demo, I spoke with Krishnan Badrinarayanan, their Product Marketing Manager. ZeroTurnaround, the creator of JRebel and LiveRebel, describes itself on its site as a company “dedicated to changing the way the world develops, tests and runs Java applications.” “We just launched LiveRebel 2.5 today,” stated Badrinarayanan, “which enables companies to embrace the concept and practice of continuous delivery, which means having a pipeline that takes products right from the developers to an end-user, faster, more frequently -- all the while ensuring that it’s a quality product that does not break in production. So customers don’t feel the discontinuity that something has changed under them and that they can’t deal with the change. And all this happens while there is zero down time.” He pointed out that Salesforce.com is not usable from 3 a.m. to 5 a.m. on Saturday because they are engaged in maintenance. “With LiveRebel 2.5, you can unify the whole delivery chain without having any downtime at all,” he said. “There are many products that tell customers to take their tools and change how they work as an organization so that they have to conform to the way the tool prescribes them to work as an application team. We take a more pragmatic approach. A lot of companies might use Jenkins or Bamboo to do continuous integration. We extend that. We say, take our product, take LiveRebel, and integrate it with Jenkins – you can do that quickly, so that, in half a day, you will be up and running. And let LiveRebel automate your deployment processes and all the automated tasks that go with it. Right from tests to the staging environment to production -- all with zero downtime and with no impact on users currently using the system.” “So if you were to make the update right now and you had 100 users on your system, they would not even know this was happening. It would maintain their sessions and transfer them over to the new version, all in the background.”

    Read the article

  • Cisco 2900 series router - 3x 3g HWIC - Can you use the same subnet for each HWIC?

    - by Lance
    We host a site with a 2900 series router that has 3x 3G HWIC cards installed. It is hosted with Telstra and plugs into our corporate WAN. Each card authenticates against RADIUS and advertises into the WAN a route for the subnet it serves. We have always used the same advertised subnet on each card. Telstra have advised us that this could be the cause of some drop-out issues, whereby some services work for some people and not for others; they are effectively saying that their system will only use one of the cards at a time, even though we can see that each interface is online and assigned a WAN IP address. Has anyone out there configured a multi-HWIC setup before, and if so, are you using different subnets for each card or the same one?

    Read the article

  • Squid 2.7 in offline_mode yet tries to contact DNS servers to resolve addresses

    - by William C
    I installed Squid 2.7 to act as a web cache on my laptop, so that I can browse previously visited sites when I don't have WiFi. Apart from http_access allow all, I've made no changes to the default squid.conf configuration. When I turn offline_mode ON, disconnect from the Internet, and then visit sites, I encounter the following error on every site I visit:

      The following error was encountered:
      Unable to determine IP address from host name for whatever.sitename.com
      The dnsserver returned: Timeout

    What settings do I need to add to squid.conf so I can browse sites offline?
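    The error suggests Squid still resolves each host name even in offline mode, so any fix also has to keep DNS answers available while disconnected. A sketch of squid.conf settings the asker could experiment with (not a confirmed fix): stretch the internal DNS cache and let a local hosts file answer known names.

      # keep successful DNS lookups cached far longer than the default
      positive_dns_ttl 24 hours
      # allow entries in a local hosts file to satisfy lookups without a resolver
      hosts_file /etc/hosts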

    Read the article

  • Outlook cannot recognize PDF v1.7 attachments - they become corrupted when received on a Linux client

    - by SkyRaT
    MS Outlook cannot recognize PDF format 1.7 when sending it as an attachment. Therefore it is sent with:

      Content-Type: application/pdf;
      Content-Transfer-Encoding: quoted-printable

    When such an e-mail is received under Linux (Thunderbird), the PDF content is parsed as plain text and converted. This results in a corrupted file that has lost all of its 0x0a (LF) bytes, because they are removed by the EOL conversion. It's definitely a problem with Outlook, but that is IMO hard to fix and deploy a fix for. Is there a way to fix this on Thunderbird's side?

    Read the article

  • Can the JVM (Oracle) run into an OutOfMemoryError if the heap size is below the max?

    - by user439407
    I am running a Tomcat site (with an nginx front end) that seems to run out of memory randomly even though the max heap size is pretty large. My question is: is it possible for the JVM to throw an OutOfMemoryError even if the current heap size is significantly less than -Xmx? For instance, here is a snapshot I took just 15 seconds before an OutOfMemoryError:

      Tue Dec 18 23:13:28 JST 2012
      Free memory: 162.31 MB
      Total memory: 727.75 MB
      Max memory: 3808.00 MB

    I guess it's theoretically possible that my code generated 3 GB worth of objects in 15 seconds, but I highly doubt it. It seems like the JVM was unable to grow the heap even though it theoretically had room. Is it possible that other processes started using memory to the point that the JVM could not grow? I am running 64-bit Oracle HotSpot on a 64-bit VM running CentOS 5 with 6 GB of RAM.
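    If the suspicion is that the heap could not grow because memory was no longer available to commit, one mitigation sketch (assuming the flags go into CATALINA_OPTS, or wherever the Tomcat JVM options are set) is to pre-reserve the full heap at startup and capture a dump when the error actually happens:

      # size the heap fully up front so later growth cannot fail, and dump on OOM
      CATALINA_OPTS="-Xms3808m -Xmx3808m \
        -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/tomcat"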

    Read the article

  • is it okay to use random URLs instead of passwords?

    - by stew
    Is it considered "safe" to use URL constructed from random characters like this? http://example.com/EU3uc654/Photos I'd like to put some files/picture galleries on a webserver that are only to be accessed by a small group of users. My main concern is that the files should not get picked up by search-engines or curious power-users that poke around my site. I've set up an .htaccess file, just to notice that clicking on http://user:pass@url/ links doesn't work well with some browsers/email clients, prompting dialogs and warnings messages that confuse my not-too-computer-savy users.

    Read the article

  • web sites leaking memory? IIS 7.5 Windows server 2008 R2

    - by Charles
    I have several web sites on my Windows 2008 server that have been working flawlessly for over a year. Just a few days ago I ran into an issue where the server stopped serving pages for some of these sites for no apparent reason. I dug into it a little more today and I see that some of my sites (they're all ASP.NET MVC 3.0 sites) are consuming over 460 MB of memory. Like I said, this just started the other day after a very long period with no issues at all. I have two questions: 1) Is there a way to cap how much memory a w3wp process consumes, forcing it to restart (i.e. recycling the app pool for that particular site) so that it doesn't keep hogging all of the memory? 2) Any ideas what could have caused this to start happening?
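    For the first question, an IIS 7.5 application pool can be set to recycle automatically once its worker process exceeds a private-memory threshold; a sketch using appcmd, where the pool name and the 500,000 KB limit are placeholders:

      %windir%\system32\inetsrv\appcmd set apppool "MySiteAppPool" /recycling.periodicRestart.privateMemory:500000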

    Read the article

  • Make download dialog show up for pictures in IIS 6.0

    - by LinuxGnut
    I have a client that is offering picture downloads from their site. They want the user to have to download the picture, rather than have it open in the browser when the link is clicked. Inserting the text "Right-click to save as" or something similar isn't an option with this client, as everything needs to be done their way. Now, I know I could accomplish this task in PHP or ASP, but I'd rather not have to write an addition to Magento to do it. So is there a way in IIS 6.0 (Server 2003) to send the right headers for image formats so that a download dialog pops up?
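    The header that makes browsers show a save dialog is Content-Disposition: attachment. In IIS 6 it can be attached as a custom HTTP header on just the folder that holds the downloads (folder Properties > HTTP Headers in IIS Manager), or scripted against the metabase; the line below is only a sketch and assumes a hypothetical /photos virtual directory under the first web site:

      cscript %systemdrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/ROOT/photos/HttpCustomHeaders "Content-Disposition: attachment"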

    Read the article

  • QueryUnit 0.0.0.8 – Trust No One

    - by Davide Mauri
    Yesterday I’ve release an updated version of QueryUnit, the version 0.0.0.8. QueryUnit now supports AreNotEqual, Greater, and Less assertions and is more capable of managing strings results. I must say that I cannot live anymore without a proper Unit Testing of a BI solution. Just yesterday happened that one of the unit tests at a customer site failed showing a subtle situation where the release of a new version of custom application would have corrupted the source of BI data with a very low chance that someone would have noticed it before several days. It may happen when you have more the 15 systems that handles the data needed by your BI solution. The key message of this situation is “Trust No One”: if your data hasn’t passed quality testing it’s not trustable. Period. QueryUnit is now officialy an hero :) No superpowers still, but useful above all. http://queryunit.codeplex.com/ Share this post: email it! | bookmark it! | digg it! | reddit! | kick it! | live it!

    Read the article

  • Is There a Cloud Over OpenWorld?

    - by Tony Berk
    If you have been to OpenWorld in the past, you know it can be overwhelming, or at least a bit "large." If this is your first time at OpenWorld, get ready! You are in for a big (or I should say HUGE) treat. The first thing you'll notice when you get to San Francisco is that there are a lot of people, buses with "Oracle" posters, large exhibit halls filled with demos, games and tchotchkes from vendors with hot new solutions, and then there are the sessions. Yes, in fact there are over 2,000 sessions. How can you possibly sort through 2,000 sessions to find the best 20 or so for you? Which are the 1% for you? We will try to help with some insight over the next few weeks. I'm going to start at the highest level: up in the clouds! I know many people are looking for an update on the Oracle Cloud. We will drill down into the cloud and other topics for CRM and Customer Experience sessions in the next set of posts. Below is a list of some of the Oracle executive keynotes during OpenWorld that highlight the Oracle Cloud and application-related topics (the full list is here). In these sessions you will get details on Oracle's strategy and how Oracle is changing the industry to help our customers be more efficient, effective and innovative.

      Sunday, September 30, 6:00pm - 7:00pm -- Larry Ellison: Hardware and Software, Engineered to Work Together: Why it's a Different Approach
      Tuesday, October 2, 8:45am - 9:45am -- Thomas Kurian: The Oracle Cloud: Oracle's Cloud Platform and Application's Strategy
      Tuesday, October 2, 3:30pm - 4:30pm -- Larry Ellison: The Oracle Cloud: Where Social is Built in
      Thursday, October 4, 9:45am - 10:45am -- Mark Hurd: See More, Act Faster: Oracle Business Analytics

    We encourage you to also join the keynotes on the Oracle Database and Cloud Infrastructure, and the fascinating partner keynotes as well. Check the full list on the OpenWorld site. Oh, and if you haven't registered yet, what are you waiting for? OpenWorld Registration Details.

    Read the article

  • How to remove an entry from Chrome's Remembered URLs from the url bar?

    - by cmcculloh
    I've got a URL in Chrome, "local.mysite.com", that auto-populates when I start typing "local.my" into the URL bar. Note that this URL DOES NOT EXIST in my browser history (at chrome://history/#e=1&p=0) because it isn't a real site and therefore could never be successfully visited, and therefore never shows up in my history. The URL I want is "local.mysite.com/subdir/". That URL is about three places down in the suggested results, because I keep accidentally hitting Enter when Chrome auto-suggests the unwanted first URL, thus reinforcing its assumption that that is the one I want. How do I get rid of the "local.mysite.com" entry in Chrome's memory?

    Read the article

  • How to trigger a check for updates in Firefox programmatically or from a command line?

    - by Triynko
    Is there a command-line switch for firefox.exe, or an "about:" URL, that will either force an update check or at least display the Help/About dialog, which checks for updates and tells you whether you're running the latest version? One site claimed that the "about:" URL was the same as the Help > About menu item, but it's not. I built a program to automate the updating of various programs on my machine, and most programs have command-line tools for checking for updates: Windows Update has wuauclt.exe, Java has jucheck.exe. For some applications I can even automate the interface, but that's difficult in Firefox, because the main window title is unpredictable (it depends on which web page is active), and all Firefox windows seem to use the exact same window class name.

    Read the article

  • 25 years old and considering a career change...possible? practical?

    - by mq330
    Hi all, I'm new to this site and new to programming as well. I've spent some time going through an intro CS book that uses Python as the language of choice. I find the exercises interesting and engaging, and I have generally had a favorable experience with programming so far. I've gone through some of the basics with Python, like writing simple programs, the basics of GUIs, manipulating strings, lists, defining functions, etc. And I've always loved technology. Although I've never done any real hardcore programming yet, I was drawn to building websites from a very young age, but I never really developed those skills. Now, the thing is, I'm 25 and I have a bachelor's in environmental studies and two master's degrees, in urban planning and landscape architecture respectively. I know it would be quite a departure to pursue a career in programming at this point. Currently, I'm working as a geographic information systems intern. I've taken some GIS classes and have a lot of experience with making maps, doing spatial analysis, etc. So what I'm thinking is that maybe I can learn some solid programming skills and apply them in the field of GIS. From what I've seen, .NET languages are the norm in this arena. Could you perhaps provide some guidance on which languages I should focus on or which courses I should take at this point? What about building web mapping applications? Also, I was thinking about getting a certificate in programming from a university extension program. Do you think it would be worth it? And furthermore, do you think potential employers would be interested in hiring someone like me (once I get a couple of languages down pretty well) as an intern or in an entry-level position? I'll be living in the Bay Area, so I feel there should be decent opportunities even though I don't have a B.S. in CS.

    Read the article
