Search Results

Search found 31417 results on 1257 pages for 'site structure'.

  • Visual Studio unable to open Web site error

    - by Jan-Frederik Carl
    Hello, I work with Visual Studio 2008 on a web project that contains a web site. When I open the solution file, I receive the error message: "Unable to open the Web site http://localhost/myWebsite.de. The Web site http://localhost/myWebsite.de does not exist." The web site appears greyed out in the solution folder with the remark "unavailable". I can add the web site to the solution again, but after relaunching Visual Studio it's gone again. Has anyone run into this or something similar?

    Read the article

  • Sharepoint and Cross-Site Lookup

    - by Mina Samy
    Hi all, I have this scenario: I want to build two SharePoint 2007 sites, one for customer info and the other for products and customer orders. The problem is that in the second site I need to reference the customer info from the first site, but unfortunately SharePoint does not provide a cross-site lookup out of the box. I did some searching, found a custom cross-site lookup field and used it, but when I upgraded the site to SharePoint 2010 this custom field was not compatible and the upgrade wizard said it could not be upgraded. So what is the solution here? Should I merge the two sites so that I can use the standard lookup feature, or is there another workaround? If anybody has faced such a scenario, please share the solution with me. Thanks

    Read the article

  • Web Site in solution where "Rebuild Solution" compile succeeds cannot launch debugger

    - by fordareh
    I have a solution that includes a Web Site (created using the web site template, not the web app project template; converting isn't an option, btw). When I Rebuild All, the compile succeeds, but strangely displays 3 errors, all of which are "Could not get dependencies for project reference 'PROJNAME'". When I try to launch the debugger, I get the "There were build errors." dialogue. Two questions: (1) If I choose the 'Yes' option in the debug error dialogue to run the last successful build, will it run on the code that my Rebuild All just compiled? (2) How do I resolve this issue? I checked this post (link at the end) and am disheartened by my prospects. What is strange, though, is that I added these same projects to a separate web site solution that compiled and debugged fine, removed the test web site and re-added the target web site I would like to debug, and it failed in the same manner. Is there a secret web site .proj file for .NET web sites? http://stackoverflow.com/questions/863379/could-not-get-dependencies-for-project-reference

    Read the article

  • django cross-site reverse a url

    - by tutuca
    I have a similar question to django cross-site reverse, but I don't think I can apply the same solution. I'm creating an app that lets users create their own site. After completing the signup form, the user should be redirected to the new post form on his site. Something along these lines:

        new_post_url = 'http://%s.domain:9292/manage/new_post' % site.domain
        logged_user = authenticate(username=user.username, password=user.password)
        if logged_user is not None:
            login(request, logged_user)
            return redirect(new_post_url)

    Now, I know that "new_post_url" is awful and makes babies cry, so I need to reverse it in some way. I thought of using django.core.urlresolvers.reverse to solve this, but that only returns URLs on my own domain, not on the user's newly created site, so it doesn't work for me. So, do you know a better/smarter way to solve this?
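
    A minimal sketch of one possible workaround (not from the original post): reverse() only returns a path, and it accepts a urlconf argument, so the path can be resolved against the subdomain's URLconf and the host part prepended by hand. The URL name 'manage_new_post' and the URLconf module 'manage.urls' below are hypothetical placeholders.

        # Sketch: combine the subdomain host with a reversed path.
        # 'manage_new_post' and 'manage.urls' are assumed names, not the poster's code.
        from django.core.urlresolvers import reverse
        from django.shortcuts import redirect

        def redirect_to_new_post(request, site):
            # Resolve the path against the subdomain's own URLconf, not the current one.
            path = reverse('manage_new_post', urlconf='manage.urls')  # e.g. '/manage/new_post'
            new_post_url = 'http://%s.domain:9292%s' % (site.domain, path)
            return redirect(new_post_url)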

    Read the article

  • IIS load balancing and site deployment

    - by KLC
    Hi, currently I have a site that sits on one IIS7 server. When we deploy a new version of the site, we bring the site down and display an offline page. What I really want is to have two exact copies of the site on that IIS 7 server and load-balance users between them. When we deploy a new version, we would bring site1 down (users on site1 automatically route to site2 on the next postback); when the site1 deployment is complete, we would bring site2 down (users on site2 are routed back to site1 on the next postback). Is this even possible?

    Read the article

  • MOSS: Creating site templates from publishing sites

    - by nav
    Hi, on my MOSS site I am trying to save a publishing site as a site template and then create subsites from this template. I am able to successfully create the site template and it is populated in the site template gallery, following these instructions: http://blah.winsmarts.com/2007-7-All_you_ever_wanted_to_know_about_SharePoint_2007_Site_Templates.aspx But when I try to create a subsite from this template, an error message is displayed stating:

        The template you have chosen is invalid or cannot be found.
        at Microsoft.SharePoint.Library.SPRequestInternalClass.ApplyWebTemplate(String bstrUrl, String& bstrWebTemplate, Int32& plWebTemplateId)
        at Microsoft.SharePoint.Library.SPRequest.ApplyWebTemplate(String bstrUrl, String& bstrWebTemplate, Int32& plWebTemplateId)

    Any help would be much appreciated. Thanks, Nav

    Read the article

  • Tool to Verify Site URLs/SiteMap?

    - by LockeCJ
    I'm moving a site from one e-commerce platform to another, and I've created URL Rewriter rules to do 301 redirects from the old URLs to the new ones. I've tested them with a small sample of URLs, but I'm looking for a tool that will let me test as many of the URLs as possible. Does anyone know of a tool that I can feed a list of URLs (or a sitemap.xml)? The tool would attempt to retrieve each URL and then report its status code. The result should be a list of URLs with their status codes, something like this:

        www.site.com/oldurlformat1/  301 Moved Permanently
        www.site.com/newurlformat1/  200 OK
        www.site.com/oldurlformat2/  301 Moved Permanently
        www.site.com/newurlformat2/  200 OK

    I can almost do this with wget, but getting the summary/report at the end is where I'm stuck.
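
    Not one of the dedicated link-checking tools the poster is after, but a minimal sketch of the same check, assuming the URLs live one per line in a plain text file (the file name url_list.txt is a placeholder):

        # Sketch: report the first-hop status code for each URL without following
        # redirects, so a 301 shows up as 301 rather than the 200 at the end of the chain.
        import requests

        with open('url_list.txt') as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                resp = requests.head(url, allow_redirects=False)
                print(url, resp.status_code, resp.reason)
                if resp.is_redirect:
                    print('  ->', resp.headers.get('Location'))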

    Read the article

  • one site or many?

    - by Alex
    I have about 10-12 websites (the main site is classic ASP, the others are ASP.NET 2.0). Each site has its own virtual directory. They are related to each other; mainly, the main site calls the other sites to perform some service. Each site has from 2 to 5 pages. Does it make sense to unite them and create one bigger site with one virtual directory and one project in VS, or leave them separate as they are? What are the pros and cons?

    Read the article

  • Facebook Site Linking Support

    - by Seth
    When someone posts a link to a web site on Facebook, Facebook populates the link preview box with a photo and some text from the site. If someone posts a link to my site, the preview generally just shows the site's domain name and one of the images that appears on the site; no text appears. I would like to be able to control what text and image appear in the preview. Is there a specification that Facebook uses? Can I provide some metadata so Facebook will display what I want?

    Read the article

  • E-commerce clothing site in .NET from scratch or from a site copy? Suggestions please

    - by pointlesspolitics
    Hi all, I am thinking of creating an e-commerce site for a clothing business. I have a bit of development knowledge in C# .NET, HTML, CSS and other web technologies. Could you please advise on these questions? Is it a good idea to create the site from scratch? I have a running example of such a site, but it is a bit old, so can I start with the old site and change the HTML and code-behind? Some people argue that it is better to use MVC for an e-commerce site. Any good reasons? BTW I don't know MVC, but I can learn it if I have to use it. It has been decided to use .NET, but if you have any ideas about better technologies, please let me know. Any ideas and issues related to the payment system are also welcome. Any answer will be appreciated. Thanks Rgds,

    Read the article

  • How to work around a site forbidding me to scrape their images with PHP

    - by Petruza
    I'm scraping a site, searching for JPGs to download. Scraping the site's HTML pages works fine, but when I try getting the JPGs with cURL, copy(), fopen(), etc., I get a 403 Forbidden status. I know that's because the site owners don't want their images scraped, so I understand a good answer would be "just don't do it, because they don't want you to". OK, but let's say it's OK and I try to work around this: how could it be achieved? If I request the same URL with a browser, I can open the image perfectly, so it's not that my IP is banned or anything, and I'm testing the scraper one file at a time, so it's not blocking me because I make too many requests too often. From my understanding, it could be that either the site is checking for some cookies that confirm that I'm using a browser and browsing their site before I download a JPG, or maybe PHP is using some user agent for the requests that the server can detect and filter out. Anyway, any ideas?
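
    If the block really is header-based, the usual first experiment is to send the User-Agent, Referer and cookies a browser would send; in PHP that means curl_setopt with CURLOPT_USERAGENT, CURLOPT_REFERER and CURLOPT_COOKIE. A rough sketch of the same experiment (in Python; the URLs and header values are placeholders, not anything from the poster's target site):

        # Sketch: fetch an image while presenting browser-like headers.
        # The URLs, Referer and User-Agent strings below are placeholder values.
        import requests

        headers = {
            'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0',
            'Referer': 'http://www.example.com/gallery.html',  # page that links to the image
        }
        # A Session keeps any cookies set while "browsing" the gallery page first.
        session = requests.Session()
        session.get('http://www.example.com/gallery.html', headers=headers)
        resp = session.get('http://www.example.com/images/photo.jpg', headers=headers)
        print(resp.status_code)
        if resp.ok:
            with open('photo.jpg', 'wb') as out:
                out.write(resp.content)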

    Read the article

  • Is a backlink with a duplicate description and title from a news site bad for SEO?

    - by Dejan Pelzel
    I have a blog with over a thousand posts. I have posted some of those to a news aggregator site and included the same preview photo and description that I used on my own site, plus the link back to the post on my site. Since the site is mainly videos and images, the description was usually a complete match of 4-6 lines of text. It now looks like I have been affected by Panda, and since I am not doing any bad stuff, I suspect it might be due to duplicate content. For example, when I search for the title of one of my posts, sometimes my site is not even returned, but the news aggregator site is. Could this be the problem with Panda?

    Read the article

  • Can third party content on sub-domains harm the main site's search rankings?

    - by dror
    I have a site that is a "portal" or "directory" for service providers. We created a page on our site for every service provider, but now we get a lot of applications from those providers who want sites of their own. We want to make a full site for every service provider, and would rather put them on subdomain URLs. (They don't mind; it's OK for them.) So, my site is www.example.com and their sites will be provider.example.com. Now I have two questions: Can the content on the provider sites harm my site in SEO? If one of those subdomains is penalized by Google because the owner does "black hat SEO", how will it affect the root domain? Can it get the root domain penalized?

    Read the article

  • Should Site Title be Before or After Page Title?

    - by NickAldwin
    Possible Duplicate: Does the order of keywords matter in a page title? Apologies if this is a dupe. I tried searching, but didn't find anything specifically addressing this concern. When creating a large(ish) site, page titles usually reference both the site name and the current page name. However, it seems there are two main conventions: "Bob's Awesome Site - Contact Page" and "Contact Page - Bob's Awesome Site". I've looked around, and pages usually use one of the two variants above. Is there any reason to use one over the other? SEO/readability/usability/etc? I've thought about it, and have only come up with:

        Page first - differentiates the tab when the browser is crowded with lots of tabs
        Site first - immediately shows the "parent" site, so to speak; a more cohesive experience

    Read the article

  • I want to search Premier Inn by sending a postcode to the site

    - by Mick
    Hi, I want to send a postcode from my site to Premier Inn and get back the hotels in the area. Does anyone know how I can go about this, please? If there is a method for working out the search string a site expects, can anybody share it? http://www.premierinn.com/en/homeQuickSearch!execute.action + postcode ??? Any help would be great, thanks Mick
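
    One generic way to sketch this in Python: submit the postcode as a form field to the search action mentioned above. The field name 'postcode' is a made-up placeholder; the real field names (and whether the form uses GET or POST) have to be read from the form's own HTML.

        # Sketch: submit a search the way the site's own form would.
        # 'postcode' is a hypothetical field name; inspect the form's
        # <input name="..."> attributes to find the real ones.
        import requests

        SEARCH_URL = 'http://www.premierinn.com/en/homeQuickSearch!execute.action'

        resp = requests.get(SEARCH_URL, params={'postcode': 'M1 1AA'})
        print(resp.status_code)
        print(resp.text[:500])  # beginning of the results page HTML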

    Read the article

  • IIS6: all websites display another site when using HTTPS

    - by Lisa
    I have the following websites set up in IIS 6:

        site1.com
        site2.com
        site3.com

    Accessing site1 is via the address https://site1.com; site2 and site3 should be accessed through http. When I try to access https://site2.com, it displays the website of https://site1.com. How can I stop this? I either want an error or a redirect to the http site. Any help would be great.

    Read the article

  • Site crawler/spider that tosses results into mysql

    - by ian.evans
    It's been suggested that we use MySQL for our site's search, as it'd be running on the same server that hosts our web server (nginx) and our db (MySQL). Since not all of our pages are created from the database, it's been suggested that we have a crawler that can crawl the site, toss each page's URL and data into MySQL, and have Sphinx index that. Does anyone know of an open source spider that has a MySQL storage option out of the box? Thanks.
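
    Not an off-the-shelf spider, but a rough sketch of what the crawl-and-store step can look like, assuming a pages(url, body) table for Sphinx to index; the table name, credentials and seed URL below are all placeholders:

        # Sketch: breadth-first crawl of one host, storing url + raw HTML in MySQL.
        import re
        import requests
        import MySQLdb  # provided by the mysqlclient package
        from urllib.parse import urljoin, urlparse

        SEED = "http://www.example.com/"
        HOST = urlparse(SEED).netloc

        db = MySQLdb.connect(host="localhost", user="crawler", passwd="secret", db="search")
        cur = db.cursor()

        seen, queue = set(), [SEED]
        while queue:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                continue
            if resp.status_code != 200 or "text/html" not in resp.headers.get("Content-Type", ""):
                continue
            cur.execute("REPLACE INTO pages (url, body) VALUES (%s, %s)", (url, resp.text))
            db.commit()
            # Naive link extraction; a real spider would use a proper HTML parser.
            for href in re.findall(r'href="([^"#]+)"', resp.text):
                link = urljoin(url, href)
                if urlparse(link).netloc == HOST and link not in seen:
                    queue.append(link)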

    Read the article

  • Web Site Serving, Cloud-Computing, oh, my

    - by Frank
    I'm planning a software-based service. To give it a bit of context (type of traffic), assume it's similar to Facebook in nature (with a little GitHub thrown in). I've been trying to understand my different hosting options. I've been using a shared host with GoDaddy for years just fine; I currently host a WordPress web site there and I've not had any problems. Quite frankly, they've taken good care of me. However, the nature of a shared hosting environment is limited: I can't do anything but host a web site there. For example, I can not run a Mercurial server. The last time I attempted to build a web application with the intention of eventually launching it via GoDaddy, I ran into all sorts of trouble because it was shared-hosted: assembly issues, etc. At the time, the cost and time sank my project. (The lack of direct access was also frustrating.) (To be fair to GoDaddy, this was over 3 years ago.) I've been looking at Rackspace or Amazon as a possible cloud solution, but that seems to be just processing power and bandwidth (and an OS); from what I understand, I'd need to get Apache and MySQL working on my own. The way cloud hosting is priced, however, seems appealing. I figure my final option might be to use a virtual private host. I think this would be more flexible than a shared host but less scalable than a cloud-based server. So, I guess my question is: what is an appropriate solution for someone who intends to build a web application service? I figure that I need to establish a hosting environment now rather than later so I can plan to use it effectively. I'd prefer to be fairly economical to start out with. I really can't afford to pay $999 (or even $99) while I build up the site and get the core functionality online, but at the same time I'd like to have the selected environment grow as needed. Thank you.

    Read the article

  • Duplicate IIS web site with Web Deploy

    - by gsantovena
    I have a Win2008 server with IIS 7, and I want to duplicate one web site, changing only the binding port and the application pool it uses, so that I end up with two web sites (locally or remotely) with the same configuration but listening on different ports. Is there a way to do this with the Web Deploy tool, in order to deploy this one web site locally and remotely and change the binding ports on the destination?

    Read the article

  • Test a site with a static subdomain locally

    - by bcmcfc
    How can I locally test a site that uses one or more static domains for serving images? E.g. domain.tld with images served from static.domain.tld. The local working copy of the site runs on WAMP, checked out from SVN, so its URLs will be pointing at static.domain.tld rather than static.domain.local.

    Read the article

  • Slow ASP.NET site using IIS6

    - by lars
    Hi, I have two servers, one virtual and one physical, running the exact same site. Both machines are running at ~2% CPU load with plenty of RAM available. Somehow the site, with caching turned off of course, loads in ~500ms on the virtual machine (which is a dev server, by the way) but takes almost 3 full seconds on the physical machine. They're both running Server 2003 and IIS 6, as well as ASP.NET 2.0. Any ideas where I can start troubleshooting this? Best Regards LP

    Read the article
