Search Results

Search found 27295 results on 1092 pages for 'cross site'.


  • Server cost/requirements for a web site with thousands of concurrent users?

    - by Angelus
    I'm working on a big project, and I do not have much experience with servers or how much they cost. The project consists of a new table game for online play and betting: essentially a poker server that must stay responsive with thousands of concurrent users. What type of server should I look for? What features, hardware or software, are required? Should I consider cloud computing? Thank you in advance.

    Read the article

  • Is there a tool that allows site users to schedule meetings with each other? [closed]

    - by Andrew Min
    I'm the webmaster for a debate team, and we're trying to find a tool that allows multiple team members to say when they're available and see who else is available during those timeslots for one-on-one practice rounds. I suppose we could use something like Doodle, but that would involve recreating the Doodle every week. There are many scheduling tools available, but they're usually built so that you sign up to meet with a specific individual (think a doctor or a professor's office hours), whereas in our case a member could be paired with ANY available individual.

    Read the article

  • How do I get Paypal or a merchant account for a marketplace style web site?

    - by Brett G
    I'm having trouble getting approved for a merchant account for my website. Basically I have expert users and regular users. Expert users provide a service through my website at rates they set themselves. Regular users purchase the services and pay me, and I pass 90% on to the expert users. I have been told this is factoring. Is the way around this a system like freelancer.com uses, where users deposit money into their account and then pay for the services they won? What are the negatives of this system? What about sites like 99designs? They accept credit card payments and then pay the winning designer. How are some sites doing this while I'm having so much trouble getting approved?

    Read the article

  • How to create robots.txt for a domain that contains international websites in subfolders?

    - by aaandre
    Hi, I am working on a site that has the following structure: site.com/us (US version), site.com/uk (UK version), site.com/jp (Japanese version), and so on. I would like to create a robots.txt that points the local search engines to a localized sitemap page and has them exclude everything else from the local listings. So google.com (US) would index ONLY site.com/us and take site.com/us/sitemap.html into consideration, google.co.uk would index only site.com/uk and site.com/uk/sitemap.html, and the same for the rest of the search engines, including Yahoo, Bing, etc. Any idea how to achieve this? Thank you!
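    Worth noting: a robots.txt rule set is selected per crawler (Googlebot, bingbot, ...), not per country-level search domain, so google.com and google.co.uk read the same file; per-locale targeting is normally done with hreflang annotations or Webmaster Tools geotargeting rather than robots.txt itself. What a single robots.txt at the site root can do is advertise one sitemap per locale -- a minimal sketch, assuming XML sitemaps exist at these paths:

        User-agent: *
        Allow: /

        Sitemap: http://site.com/us/sitemap.xml
        Sitemap: http://site.com/uk/sitemap.xml
        Sitemap: http://site.com/jp/sitemap.xml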

    Read the article

  • DFS - Stop sync of large folder that has since been removed

    - by g18c
    We have site-to-site DFSR on Windows Server 2008 R2 that had been running perfectly between site A and site B until someone dumped a 20 GB folder into it. This has overwhelmed the upload link and made the internet almost unusable at site A (upload bandwidth is low at the branch office). We have removed the folder from the DFS share on site A, but the internet is still really slow. Is there any way to cancel this sync, or otherwise get DFSR back into a happy state?
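    One way to see whether the deleted folder is still queued is to check the replication backlog with dfsrdiag. This is only a sketch: BranchRG, DataRF and the server names are placeholders for the actual replication group, replicated folder and members (verify the exact switches with dfsrdiag /?):

        dfsrdiag backlog /rgname:BranchRG /rfname:DataRF /sendingmember:SITEA-SRV /receivingmember:SITEB-SRV

    If the backlog is still large, temporarily disabling the membership or connection in DFS Management (or tightening the connection's bandwidth/schedule) stops the transfer while DFSR catches up with the deletion.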

    Read the article

  • Which web site gives the most accurate indication of a programmer's capabilities?

    - by Jerry Coffin
    If you were hiring programmers, and could choose between one of (say) the top 100 coders on topcoder.com, or one of the top 100 on stackoverflow.com, which would you choose? At least to me, it would appear that topcoder.com gives a more objective evaluation of pure ability to solve problems and write code. At the same time, despite obvious technical capabilities, this person may lack any hint of social skills -- he may be purely a "lone coder", with little or no ability to help or work with others, and may lack the mentoring ability to transfer his technical skills to others. On the other hand, stackoverflow.com would at least appear to give a much better indication of peers' opinion of the coder in question, and the degree to which his presence is useful and helpful to others on the "team". At the same time, the scoring system is such that somebody who just throws up a lot of mediocre (or even poor) answers will almost inevitably accumulate a positive total of "reputation" points -- a single up-vote (perhaps given just out of courtesy) counteracts the effects of no fewer than 5 down-votes, and others are discouraged (to some degree) from down-voting because they have to sacrifice their own reputation points to do so. At the same time, somebody who makes little or no technical contribution seems unlikely to accumulate a reputation that lands them (even close to) the top of the heap, so to speak. So, which provides a more useful indication of the degree to which a particular coder is likely to be useful to your organization? If you could choose between them, which set of coders would you rather have working on your team?

    Read the article

  • Force www. on multi domain site and retain http or https

    - by John Isaacks
    I am using CakePHP, which already ships with an .htaccess file that looks like this:

        <IfModule mod_rewrite.c>
           RewriteEngine on
           RewriteRule    ^$    app/webroot/     [L]
           RewriteRule    (.*)  app/webroot/$1   [L]
        </IfModule>

    I want to force www. (unless it is a subdomain) to avoid duplicate-content penalties, and it needs to retain http or https. Also, this application will have multiple domains pointing to it, so the code needs to work with any domain.
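    A sketch of one way to do this, placed above the existing rules inside the same IfModule block. It assumes plain two-label hosts such as example.com (domains like example.co.uk would need an adjusted host pattern), and it uses the %{HTTPS}s capture trick so the original scheme is kept:

        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteCond %{HTTP_HOST} ^[^.]+\.[^.]+$
        RewriteCond %{HTTPS}s ^on(s)|
        RewriteRule ^ http%1://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    The second condition is what leaves subdomains alone: any host with more than one dot fails the pattern and is served without a redirect.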

    Read the article

  • How to set up Google DFP (DoubleClick for Publishers) for a site?

    - by Manoj Kumar
    I have a website and I have an AdSense account as well. I have integrated AdSense and ads are getting displayed (480 x 60). Somewhere I read that I can manage the ads that are shown on my website (480 x 60) and filter them on a CPM/CPC basis. NOTE: I don't have any ads to display on other people's websites; I just want to show other people's ads on my website. Now, can I use Google DFP to manage the ads? I mean, is Google DFP useful for filtering the ads and getting me more revenue?

    Read the article

  • EMC/Legato/Networker Failed to recover files: Cross Platform Recovery not supported.

    - by marc.riera
    Backup software: EMC / Legato Networker. Legato server: Windows. Legato clients: same hardware (Fedora two years ago, now Ubuntu). We are trying to recover from an old client that is no longer available. This is the situation: on 07/20/2008 we backed up a Samba server (Fedora) to tape, setting 1 year as both the browse policy and the retention policy. That tape is now recyclable. We took down the DNS name and deleted the Legato client configuration. The machine was reinstalled and is now doing other things on Ubuntu 10.04, with a different name but the same IP.

    Now, two years and some months later, we need to recover a folder from the 2008 backup of the fedora-samba-server. First, Legato does not show the client name because the configuration was deleted, so we create it again: we put the old DNS name back, pointing at the same IP where the old server was (same MAC address), and create a new 'old client configuration' pointing to the new server (with a different Legato client ID, I suppose). The ssid holding the needed folder is on two tapes, 20 and 22; the index for that backup is on tape 21. We put these tapes in the jukebox (IBM T4000) -- not important for the issue. All three tapes have passed their browsable and recoverable periods, so they are recyclable.

    We get the clone ID from the ssid:

        mminfo -avot -q "ssid=<ssid>" -r cloneid

    We set the tapes to not recyclable:

        nsrmm -S <ssid>/<cloneid> -o notrecyclable

    We change the retention for the tapes to a future date:

        nsrmm -S <ssid> -e 01/20/2011

    We check that the dates are correct:

        mminfo -avV -q "ssid=<ssid>" -r ssbrowse(26),ssretent(26),savetime

    So far it's OK. We close the terminal and restart the server, just to be sure. Finally, we recover the index for the ssid where the folder should be:

        nsrck -L7 -t "07/20/2008" oldservername.domain.org

    Then we open Networker User, select the server, select the old client as source and the new client as destination. And this is what I get (screenshot of the output): http://i.imgur.com/1nOr8.png

    Should I understand that I need to install whatever operating system was running on the old "Linux server" / "Networker client" to be able to restore 26 MB of files? Thanks.

    Read the article

  • How much time do you spend on this site with work on your desk?

    - by David Conde
    This is a simple question, because I want to see if there is a collective tendency to do this: how much time do you spend snooping around StackOverflow, Programmers, and all those heavenly sites even when you have work on your desk? Are these sites becoming Facebook, in our case? I used to visit Facebook from time to time... it's been about 20 days now and I don't even care about it. All I want to see is the latest posts from my geeky friends around here! Is this collective?

    Read the article

  • How can I force Google to re-index my site?

    - by Matthias
    I changed the structure of my URLs. The pages are already indexed by Google and have the following structure: http://mypage.com/myfolder/page.aspx. The new structure is http://mypage.com/page.aspx. Now all the URLs that Google knows are wrong. How can I tell Google to re-index the site and that the structure has changed? Internally I redirect in ASP.NET when the URL contains myfolder, but I want Google to update the URLs. Thanks for the answers - I use IIS 6 and I do not know how to configure a redirect of all pages that contain the folder to the page one folder below, so I did the trick in the Begin_Request method with a Context.Response.Redirect. This is not a 301 redirect, only a redirect done with ASP.NET via code. Will this also do the trick so that Google notices that the URL /folder/page1.aspx is now redirected to /page1.aspx?
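    For what it's worth, Context.Response.Redirect issues a temporary 302, which search engines treat as a non-permanent move. A minimal sketch of sending a permanent 301 from Global.asax instead; the "/myfolder/" prefix comes from the question, everything else is illustrative:

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Requests still using the old folder-based URL get a permanent redirect to the flat URL.
            string path = HttpContext.Current.Request.Url.AbsolutePath;
            if (path.StartsWith("/myfolder/", StringComparison.OrdinalIgnoreCase))
            {
                string newPath = path.Substring("/myfolder".Length); // "/myfolder/page1.aspx" -> "/page1.aspx"
                HttpContext.Current.Response.StatusCode = 301;
                HttpContext.Current.Response.RedirectLocation = newPath;
                HttpContext.Current.Response.End();
            }
        }

    With a 301 in place, Google updates the old URLs to the new ones as it recrawls them; submitting a sitemap of the new URLs speeds that up.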

    Read the article

  • Why would a web site keep my signup information for a limited time only?

    - by Alois Mahdal
    I have just created an account at (some web service; well, actually it was Transifex, a localization service). The registration form requested the typical things: account name, e-mail address, password (twice), and optionally a company name and phone number. What confused me was this sentence on the confirmation page (the one right after submitting the form): "We will store your signup information for 7 days on our server." Can anybody explain what this means? What exactly are they referring to by "signup information", if it's something that should be kept for only 7 days? Or is my account going to be destroyed after that time? (Well, that could make sense for some special services, but not for this one.)

    Read the article

  • How long does it take for Google Webmasters to index a site after submitting a sitemap? [closed]

    - by Venkatesh Hodavdekar
    Possible Duplicate: Why isn't my website in Google search results? I submitted my website to Google today through Google Webmaster Tools, using a sitemap. The sitemap status says OK and shows that 12 URLs have been recognized. I was wondering how long it takes for the links to get indexed, as the indexed-URLs option says "No data available. Please check back soon." I am not sure if it is showing this message due to some error, or whether everything is fine.

    Read the article

  • Is an app that does nothing but link to a web site functional enough to meet Apple's iOS guidelines?

    - by Pointy
    I don't hang out on Programmers enough to know whether this question is "ok", so my apologies if not. I tried to make the title obvious so at least it can be closed quickly :-) The question is simple. My employer wants "home screen presence" (or at least the possibility thereof) on iOS devices (also Android but I'm mostly interested in Apple at the moment). Our actual application will be a pure web-delivered mobile-friendly application, so what we want on the homescreen is basically something that just acts as a link to bring up Safari (or Chrome now I guess; not important). I'm presuming that that's more-or-less possible; if not then that would be interesting too. I know that the Apple guidelines are such that low-functionality apps are generally rejected out of hand. There are a lot of existing apps that seem (to me) less functional than a link to something useful, but I'm not Apple of course. Because this seems like a not-too-weird situation, I'm hoping that somebody knows it's either definitely OK (maybe because there are many such apps) or definitely not OK. Note that I know about things like PhoneGap and I don't want that, at least not at the moment.

    Read the article

  • How do I prevent my ASP .NET site from continually prompting for user credentials?

    - by gilles27
    I'm trying to get an ASP .NET website up and running on IIS6. The site will run in its own application pool, and uses Windows authentication, with anonymous access turned off. When I run the app pool under NETWORK SERVICE, everything works fine. However we need the app pool to run under a different account, because this account needs some extra privileges (we are printing Word documents). This new account is a member of the local users group, and the IIS_WPG group. It has also been granted the "Log on as a service right". When I browse to the site I am prompted for credentials, not once, but several times. When the page finally loads it looks wrong because the style sheets have not been applied. My suspicion is that I am being prompted once for each file (e.g. all the images, styles and script files) the browser requests, and that for some reason the website is unable to validate those credentials in order to serve the files back. If I allow anonymous access the page loads fine - we don't want to allow it but I mention it in case it offers any further clues. My theory is that perhaps the account the app pool runs under needs permissions to validate domain credentials? If that is so, how do I enable this?

    Read the article

  • Does the EU cookie law apply to an EU site that is hosted outside of the EU?

    - by mickburkejnr
    I have been reading up about this EU cookie law, and have also had in-depth conversations with my girlfriend, who is a solicitor/lawyer, and with colleagues while building websites. While we are now working towards implementing a way to abide by the EU law, I have thought of something which no one really knows the answer to and which has caused a few arguments. It's my understanding that any website in the EU must abide by these cookie laws, which is understandable. However, say I were to have a .co.uk or .eu domain name pointing to a website that is hosted in America, for example: do I still need to abide by the EU laws even though the website is hosted outside of the EU? One person I have asked says that because the domain name is .co.uk or .eu (a European TLD), the website is still accountable under EU law. Another person says that because the actual website is hosted outside of the EU, it doesn't have to bother with this law.

    Read the article

  • How can I determine the trending pages on my site?

    - by Dogweather
    I'm looking to see what the "hot" pages are on one of my sites. I want to see, for various timeframes, what the top-50 pages are. I'm going to create a data feed with this info, which will be input to another app. I have Apache logs and complete control of the machine to install what I want. I'm mostly wondering if there's something out there already that I can use, or, if I have to implement it myself, what good algorithms or strategies might be. Thanks.
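    For the raw counts there may be no need for extra tooling: the request path is field 7 of the standard common/combined Apache log format, so a top-50 list for a given timeframe can be pulled straight from the log. A rough sketch (the date string and log path are illustrative):

        grep "14/Oct/2012" /var/log/apache2/access.log | awk '{ print $7 }' | sort | uniq -c | sort -rn | head -50

    For "trending" rather than merely "popular", one common strategy is to compute the same counts for a short recent window and a longer baseline window, then rank pages by the ratio (or difference) between the two.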

    Read the article

  • Apple embeds JavaScript code on its site to hide the apology it must make to Samsung under a court ruling

    "Samsung did not copy the iPad" -- Apple is required to display this message by way of advertising, following a British court ruling. One of the legal battles between Apple and Samsung over copying the look-and-feel of the iPad has just reached its verdict: the Cupertino firm has lost the case it brought against Samsung in the United Kingdom over the design of the Galaxy Tab, which it claimed was too similar to that of the iPad. [IMG]http://ftp-developpez.com/gordon-fowler/GalaxyTab.jpg[/IMG] Galaxy Tab. The British judge Colin Birss ruled that Samsung had not infringed the patents invoked by Apple, and that the...

    Read the article

  • How do I control how often search engines visit my site?

    - by Nick
    I've been using the following line in the <head> of my sites for years: <meta name="revisit-after" content="3 days" /> I recently discovered that it's not one of the meta tags that Google understands, which I take to mean that there's no point in including it, and that it's been doing no good at all for years. How often do search engines crawl a website by default, and what reliable ways are there to increase or decrease that frequency?
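    There is no single default -- crawl frequency is decided per site, based on things like how often content changes, how the site is linked, and how fast the server responds. Two hints some crawlers do honour (a sketch with illustrative values; Google ignores Crawl-delay and treats changefreq only as a hint, its crawl rate being adjusted in Webmaster Tools instead):

        # robots.txt -- ask crawlers that support it (e.g. bingbot) to pause between fetches
        User-agent: bingbot
        Crawl-delay: 10

        <!-- sitemap.xml fragment: lastmod/changefreq advertise how often a URL changes -->
        <url>
          <loc>http://example.com/news/</loc>
          <lastmod>2012-10-14</lastmod>
          <changefreq>daily</changefreq>
        </url>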

    Read the article

  • What are some efficient ways to set up my environment when working on a remote site?

    - by Prefix
    Hello fellow programmers, I am still a relatively new programmer and have recently gotten my first on-campus programming position. I am the sole dev responsible for 8 domains as well as 3 small PHP web apps. The campus has its web environment divided into staging and live servers -- we develop on staging via SFTP and then push updates to the live server through a web GUI. I use Sublime Text 2 and the Sublime SFTP plugin for all my dev work (it's my preferred editor). If I am just making an edit to a page, I'll open that individual file via the FTP browser. If I am working on the PHP web app projects, I have the app directory mapped to a local folder so that when I save locally the file is auto-uploaded through Sublime SFTP. I feel like this workflow is slow and sub-optimal. How can I improve my workflow for working with remote content? I'd love to set up a local environment on my machine, as that would eliminate the constant SFTP upload/download, but as I said there are many sites, and a local copy of the entire domain would be quite large and complex; not to mention keeping it in sync with whatever is latest on the staging server would be a nightmare. Anyone know how I can improve my general web dev workflow from what I've described? I'd really like to cut out constantly editing over FTP, but I'm not sure where to start other than ripping down the entire directory and dumping it into XAMPP.
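    If the staging server allows SSH, one way to cut out the constant per-file SFTP traffic is to mirror each project with rsync and work locally; a sketch where the host, paths and username are placeholders:

        # pull the current staging copy down before starting work
        rsync -avz --delete user@staging.example.edu:/var/www/site1/ ~/dev/site1/

        # push local edits back up when finished
        rsync -avz ~/dev/site1/ user@staging.example.edu:/var/www/site1/

    Only changed files are transferred, so keeping a full local copy up to date is much cheaper than it sounds, and it pairs naturally with a local XAMPP install for testing.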

    Read the article

  • How can IIS 7.5 have the error pages for a site reset to the default configuration?

    - by Sn3akyP3t3
    A mishap occurred with web.config while accommodating a sub-site: I made use of "<location path="." inheritInChildApplications="false">". Essentially it was a workaround put in place for nested web.config files, which were causing a conflict. The result was that error pages were not being handled properly -- error 500 was being passed to the client for every type of error encountered. Removing the offending inheritInChildApplications attribute from the root web.config restored normal operation for most of the error handling, but for some reason, while error 503 now returns a correct response header, the IIS server is performing the custom action for error 403.4, which is a redirect to https. I'm looking to restore the defaults for the error pages so that the default behavior is back in place; I can then re-add my error-page customizations.
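    One hedged suggestion: the custom error pages live in the <httpErrors> section, so clearing that section at the site level makes the site fall back to the inherited server defaults. A sketch using appcmd, where "Default Web Site" stands in for the actual site name (deleting the <httpErrors> element from the site's web.config should have the same effect):

        %windir%\system32\inetsrv\appcmd.exe clear config "Default Web Site" /section:httpErrors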

    Read the article

  • How to publish a page to two sites?

    - by George2
    Hello everyone, I am using SharePoint 2007 Enterprise + Publishing portal template + Windows Server 2008. I have a root site and a sub-site. I want to enable the following function: when the sub-site administrator publishes a page, the administrator can choose to publish it to the sub-site only or to both the root site and the sub-site. Any ideas how to implement this? I am not sure whether there is a ready-to-use solution that does not require writing code. Thanks in advance, George

    Read the article
