Search Results

Search found 31417 results on 1257 pages for 'site structure'.

  • Django CSRF framework cannot be disabled and is breaking my site

    - by MikeN
    The Django CSRF middleware can't be disabled. I've commented it out of my project's middleware settings, but my logins are still failing due to missing-CSRF errors. I'm working from the Django trunk. How can CSRF cause problems if it is not enabled in the middleware? I have to disable it because there are lots of POST requests on my site that CSRF simply breaks. Any feedback on how I can completely disable CSRF in a Django trunk project? The "new" CSRF framework from Django's trunk is also breaking an external site that comes in and does a POST to a URL I'm giving them (this is part of a RESTful API). Since I can't disable the CSRF framework, as I said earlier, how can I fix this?
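    A hedged sketch of the usual escape hatch (assuming Django trunk around the 1.2 era, where the new CSRF framework ships a csrf_exempt decorator in django.views.decorators.csrf; the view name below is made up): exempt the individual POST endpoints rather than fighting the middleware globally.

        # Sketch only: per-view exemption from CsrfViewMiddleware.
        from django.views.decorators.csrf import csrf_exempt

        @csrf_exempt  # skips the CSRF check for this view alone
        def external_api_endpoint(request):
            # hypothetical view name -- handle the partner site's POST here
            ...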

    Read the article

  • Logging Into a site that uses Live.com authentication

    - by Josh
    I've been trying to automate a log in to a website I frequent, www.bungie.net. The site is associated with Microsoft and Xbox Live, and as such makes use of the Windows Live ID API when people log in. I am relatively new to creating web spiders/robots, and I worry that I'm misunderstanding some of the most basic concepts. I've simulated logins to other sites such as Facebook and Gmail, but live.com has given me nothing but trouble. Anyway, I've been using Wireshark and the Firefox addon Tamper Data to try to figure out what I need to post and what cookies I need to include with my requests. As far as I know, these are the steps one must follow to log in to this site:

    1. Visit https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917
    2. Receive the cookies MSPRequ and MSPOK.
    3. Post the values from the form ID "PPSX", the values from the form ID "PPFT", your username, and your password to a changing URL similar to https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct= (a few numbers at the end of that URL change).
    4. Live.com returns the user a page with more hidden forms to post. The client then posts the value from the form "ANON", the value from the form "ANONExp", and the value from the form "t" to the URL http://www.bungie.net/Default.aspx?wa=wsignin1.0
    5. After posting that data, the user is returned a variety of cookies, the most important of which is "BNGAuth", the login cookie for the site.

    Where I am having trouble is the fifth step, but that doesn't necessarily mean I've done all the other steps correctly. I post the data from "ANON", "ANONExp" and "t", but instead of being returned a BNGAuth cookie, I'm returned a cookie named "RSPMaybe" and redirected to the home page. When I review the Wireshark log, I noticed something that instantly stood out as different between the log of the Firefox login and the log of my program's run. It could be nothing, but I'll include the picture here for you to review. I'm being returned an HTTP packet from the site before I post the data in the fourth step. I'm not sure how this is happening, but it must be a side effect of something I'm doing wrong in the HTTPS steps.
        using System;
        using System.Collections.Generic;
        using System.Collections.Specialized;
        using System.Text;
        using System.Net;
        using System.IO;
        using System.IO.Compression;
        using System.Security.Cryptography;
        using System.Security.Cryptography.X509Certificates;
        using System.Web;

        namespace SpiderFromScratch
        {
            class Program
            {
                static void Main(string[] args)
                {
                    CookieContainer cookies = new CookieContainer();
                    Uri url = new Uri("https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917");
                    HttpWebRequest http = (HttpWebRequest)HttpWebRequest.Create(url);
                    http.Timeout = 30000;
                    http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)";
                    http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
                    http.Headers.Add("Accept-Language", "en-us,en;q=0.5");
                    http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                    http.Headers.Add("Keep-Alive", "300");
                    http.Referer = "http://www.bungie.net/";
                    http.ContentType = "application/x-www-form-urlencoded";
                    http.CookieContainer = new CookieContainer();
                    http.Method = WebRequestMethods.Http.Get;
                    HttpWebResponse response = (HttpWebResponse)http.GetResponse();
                    StreamReader readStream = new StreamReader(response.GetResponseStream());
                    string HTML = readStream.ReadToEnd();
                    readStream.Close();

                    // gets the cookies (they are set in the eighth header)
                    string[] strCookies = response.Headers.GetValues(8);
                    response.Close();
                    string name, value;
                    Cookie manualCookie;
                    for (int i = 0; i < strCookies.Length; i++)
                    {
                        name = strCookies[i].Substring(0, strCookies[i].IndexOf("="));
                        value = strCookies[i].Substring(strCookies[i].IndexOf("=") + 1, strCookies[i].IndexOf(";") - strCookies[i].IndexOf("=") - 1);
                        manualCookie = new Cookie(name, "\"" + value + "\"");
                        Uri manualURL = new Uri("http://login.live.com");
                        http.CookieContainer.Add(manualURL, manualCookie);
                    }

                    // stores the cookies to be used later
                    cookies = http.CookieContainer;

                    // Get the PPSX value
                    string PPSX = HTML.Remove(0, HTML.IndexOf("PPSX"));
                    PPSX = PPSX.Remove(0, PPSX.IndexOf("value") + 7);
                    PPSX = PPSX.Substring(0, PPSX.IndexOf("\""));

                    // Get this random PPFT value
                    string PPFT = HTML.Remove(0, HTML.IndexOf("PPFT"));
                    PPFT = PPFT.Remove(0, PPFT.IndexOf("value") + 7);
                    PPFT = PPFT.Substring(0, PPFT.IndexOf("\""));

                    // Get the random URL you POST to
                    string POSTURL = HTML.Remove(0, HTML.IndexOf("https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct="));
                    POSTURL = POSTURL.Substring(0, POSTURL.IndexOf("\""));

                    // POST with cookies
                    http = (HttpWebRequest)HttpWebRequest.Create(POSTURL);
                    http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)";
                    http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
                    http.Headers.Add("Accept-Language", "en-us,en;q=0.5");
                    http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                    http.Headers.Add("Keep-Alive", "300");
                    http.CookieContainer = cookies;
                    http.Referer = "https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268158321&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917";
                    http.ContentType = "application/x-www-form-urlencoded";
                    http.Method = WebRequestMethods.Http.Post;
                    Stream ostream = http.GetRequestStream();

                    // used to convert strings into bytes
                    System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();

                    // Post information
                    byte[] buffer = encoding.GetBytes("PPSX=" + PPSX + "&PwdPad=IfYouAreReadingThisYouHaveTooMuc&login=YOUREMAILGOESHERE&passwd=YOURWORDGOESHERE" + "&LoginOptions=2&PPFT=" + PPFT);
                    ostream.Write(buffer, 0, buffer.Length);
                    ostream.Close();
                    HttpWebResponse response2 = (HttpWebResponse)http.GetResponse();
                    readStream = new StreamReader(response2.GetResponseStream());
                    HTML = readStream.ReadToEnd();
                    response2.Close();
                    ostream.Dispose();
                    foreach (Cookie cookie in response2.Cookies)
                    {
                        Console.WriteLine(cookie.Name + ": ");
                        Console.WriteLine(cookie.Value);
                        Console.WriteLine(cookie.Expires);
                        Console.WriteLine();
                    }

                    // SET POSTURL value
                    string POSTANON = "http://www.bungie.net/Default.aspx?wa=wsignin1.0";

                    // Get the ANON value
                    string ANON = HTML.Remove(0, HTML.IndexOf("ANON"));
                    ANON = ANON.Remove(0, ANON.IndexOf("value") + 7);
                    ANON = ANON.Substring(0, ANON.IndexOf("\""));
                    ANON = HttpUtility.UrlEncode(ANON);

                    // Get the ANONExp value
                    string ANONExp = HTML.Remove(0, HTML.IndexOf("ANONExp"));
                    ANONExp = ANONExp.Remove(0, ANONExp.IndexOf("value") + 7);
                    ANONExp = ANONExp.Substring(0, ANONExp.IndexOf("\""));
                    ANONExp = HttpUtility.UrlEncode(ANONExp);

                    // Get the t value
                    string t = HTML.Remove(0, HTML.IndexOf("id=\"t\""));
                    t = t.Remove(0, t.IndexOf("value") + 7);
                    t = t.Substring(0, t.IndexOf("\""));
                    t = HttpUtility.UrlEncode(t);

                    // POST the Info and Accept the Bungie Cookies
                    http = (HttpWebRequest)HttpWebRequest.Create(POSTANON);
                    http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)";
                    http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
                    http.Headers.Add("Accept-Language", "en-us,en;q=0.5");
                    http.Headers.Add("Accept-Encoding", "gzip,deflate");
                    http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                    http.Headers.Add("Keep-Alive", "115");
                    http.CookieContainer = new CookieContainer();
                    http.ContentType = "application/x-www-form-urlencoded";
                    http.Method = WebRequestMethods.Http.Post;
                    http.Expect = null;
                    ostream = http.GetRequestStream();
                    int test = ANON.Length;
                    int test1 = ANONExp.Length;
                    int test2 = t.Length;
                    buffer = encoding.GetBytes("ANON=" + ANON + "&ANONExp=" + ANONExp + "&t=" + t);
                    ostream.Write(buffer, 0, buffer.Length);
                    ostream.Close();

                    // Here lies the problem, I am not returned the correct cookies.
                    HttpWebResponse response3 = (HttpWebResponse)http.GetResponse();
                    GZipStream gzip = new GZipStream(response3.GetResponseStream(), CompressionMode.Decompress);
                    readStream = new StreamReader(gzip);
                    HTML = readStream.ReadToEnd();

                    // gets both cookies
                    string[] strCookies2 = response3.Headers.GetValues(11);
                    response3.Close();
                }
            }
        }

    This has given me problems and I've put many hours into learning about HTTP protocols, so any help would be appreciated. If there is an article detailing a similar log in to live.com, feel free to point the way. I've been looking far and wide for any articles with working solutions. If I could be clearer, feel free to ask, as this is my first time using Stack Overflow.

    Read the article

  • Django admin site auto-populate combo box based on input

    - by user292652
    Hi, I have the following model:

        class Match(models.Model):
            Team_one = models.ForeignKey('Team', related_name='Team_one')
            Team_two = models.ForeignKey('Team', related_name='Team_two')
            Stadium = models.CharField(max_length=255, blank=True)
            Start_time = models.DateTimeField(auto_now_add=False, auto_now=False, blank=True, null=True)
            Rafree = models.CharField(max_length=255, blank=True)
            Judge = models.CharField(max_length=255, blank=True)
            Winner = models.ForeignKey('Team', related_name='winner', blank=True)
            updated = models.DateTimeField('update date', auto_now=True)
            created = models.DateTimeField('creation date', auto_now_add=True)

            def save(self, force_insert=False, force_update=False):
                pass

            @models.permalink
            def get_absolute_url(self):
                return ('view_or_url_name')

        class MatchAdmin(admin.ModelAdmin):
            list_display = ('Team_one', 'Team_two', 'Winner')
            search_fields = ['Team_one', 'Team_tow']

        admin.site.register(Match, MatchAdmin)

    I was wondering, is there a way to populate the Winner combo box once Team one and Team two have been selected in the admin site?
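    One possible direction, as a sketch rather than a full answer: reacting while the user picks teams needs custom admin JavaScript, but on the change form of an already-saved Match you can narrow the queryset. The MatchForm class below is new; everything else follows the models above.

        # Sketch: limit the Winner dropdown to the two teams on this Match.
        from django import forms
        from django.contrib import admin

        class MatchForm(forms.ModelForm):
            class Meta:
                model = Match

            def __init__(self, *args, **kwargs):
                super(MatchForm, self).__init__(*args, **kwargs)
                if self.instance.pk:  # only once both teams are saved
                    self.fields['Winner'].queryset = Team.objects.filter(
                        pk__in=[self.instance.Team_one_id, self.instance.Team_two_id])

        class MatchAdmin(admin.ModelAdmin):
            form = MatchForm
            list_display = ('Team_one', 'Team_two', 'Winner')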

    Read the article

  • Existing web-site CSS replacement (re-skinning) best-practices without changing the HTML

    - by Enigmativity
    I can see a number of other good answers to questions relating to CSS best practices on Stack Overflow:

    - How to Manage CSS Explosion
    - CSS Conventions / Code Layout Models
    - Are there any CSS standards that I should follow while writing my first stylesheet?
    - What is the best method for tidying CSS?
    - Best Practices - CSS Stylesheet Formatting

    But I think I have a different problem. I'm trying to "re-skin" an existing site that has been nicely built using div's and ul's, etc., and it has a good existing CSS file, but when I start making changes to the CSS I quickly find that I break the layout. My feeling is that it is very hard to get a feel for how all the CSS will work together, and indeed what CSS is affecting parent and sibling elements in the HTML. So, my question is: what are the best practices around re-skinning an existing web site by replacing the CSS only and not modifying the existing HTML? I can't change the classes, ids, node hierarchy, etc. An example of the particular site that I am trying to re-skin is http://demo.nopcommerce.com/. The existing CSS can be as complicated/detailed as this extract from the main CSS file:

        .header-selectors-wrapper {
            text-align: right;
            float: right;
            width: 500px;
        }
        .header-currencyselector {
            float: right;
        }
        .header-languageselector {
            float: left;
        }
        .header-taxDisplayTypeSelector {
            float: right;
        }
        .header-links-wrapper {
            float: right;
            text-align: right;
            width: 570px;
        }
        .header-links {
            border: solid 1px #9a9a9a;
            padding: 5px 5px 5px 5px;
            margin-bottom: 5px;
            display: inline-table;
        }
        .order-summary-content .cart .cart-item-row td,
        .wishlist-content .cart .cart-item-row td {
            border-bottom: 1px solid #c5c5c5;
            vertical-align: middle;
            line-height: 30px;
        }
        .order-summary-content .cart .cart-item-row td.product,
        .wishlist-content .cart .cart-item-row td.product {
            text-align: left;
            padding: 0px 10px 0px 10px;
        }
        .order-summary-content .cart .cart-item-row td.product a,
        .wishlist-content .cart .cart-item-row td.product a {
            font-weight: bold;
        }

    Any help would be appreciated.

    Read the article

  • Sending emails to site subscribers

    - by Art
    I have a site where people enter their email address to be reminded of a weekly posting. I only send them the weekly email that they double-opted-in for. I do not sell, trade, expose, or do anything else with their addresses. Right now I have a bit of a hack: connecting to a large email provider's SMTP server to send the emails. As the site grows I fear that my emails will be marked as spam and that I will be violating the provider's TOS. I'm considering two options: building an in-house SMTP server or using a mailing-list service. Is there a best known method for this? I'd really prefer to simply build my email and use a script to send it through the SMTP server rather than use some GUI to build the email and import a list of addresses. But previous experience tells me that running a mail server without being marked as spam can be difficult. Help!?
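    For what it's worth, the scripted approach the poster prefers is only a few lines with Python's standard library. A hedged sketch: the host, addresses, and subject are placeholders, and deliverability still depends on SPF/DKIM and reverse DNS for whichever server actually relays the mail.

        # Sketch: send one plain-text message per double-opted-in subscriber.
        import smtplib
        from email.mime.text import MIMEText

        def send_weekly(subscribers, body, host='localhost'):
            with smtplib.SMTP(host) as smtp:
                for addr in subscribers:
                    msg = MIMEText(body)
                    msg['Subject'] = 'Weekly reminder'
                    msg['From'] = 'reminders@example.com'
                    msg['To'] = addr
                    # An unsubscribe header helps keep spam complaints down.
                    msg['List-Unsubscribe'] = '<mailto:unsubscribe@example.com>'
                    smtp.send_message(msg)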

    Read the article

  • Bulk email notifications API for social networking site

    - by goombaloon
    I'm looking to build a social networking site that notifies followers of a particular user when new content has been posted by that user. Since the hope is that the site will eventually have thousands of users with thousands of followers, I'd prefer to use an outside product/service to send the emails rather than building the functionality into the application and dealing with scalability issues. I've looked at MailChimp and a few others out there. Since my use case isn't really email campaigns, these providers don't seem like the right solution. In fact, MailChimp support confirmed for me that my scenario doesn't fit for them. I'm sure others have run into this same requirement. What are others out there using? I'm using .NET so would like to target something that has a .NET wrapper if possible. Any advice would be most appreciated!

    Read the article

  • Tactics for using PHP in a high-load site

    - by Ross
    Before you answer this: I have never developed anything popular enough to attain high server loads. Treat me as (sigh) an alien that has just landed on the planet, albeit one that knows PHP and a few optimisation techniques.

    I'm developing a tool in PHP that could attain quite a lot of users, if it works out right. However, while I'm fully capable of developing the program, I'm pretty much clueless when it comes to making something that can deal with huge traffic. So here are a few questions on it (feel free to turn this question into a resource thread as well).

    Databases: At the moment I plan to use the MySQLi features in PHP5. However, how should I set up the databases in relation to users and content? Do I actually need multiple databases? At the moment everything's jumbled into one database, although I've been considering spreading user data to one, actual content to another, and finally core site content (template masters etc.) to a third. My reasoning behind this is that sending queries to different databases will ease the load on them, as one database = 3 load sources. Also, would this still be effective if they were all on the same server?

    Caching: I have a template system that is used to build the pages and swap out variables. Master templates are stored in the database, and each time a template is called, its cached copy (an HTML document) is fetched. At the moment I have two types of variable in these templates: a static var and a dynamic var. Static vars are usually things like page names or the name of the site, things that don't change often; dynamic vars are things that change on each page load. My question on this: say I have comments on different articles. Which is the better solution: store the simple comment template and render comments (from a DB call) each time the page is loaded, or store a cached copy of the comments page as an HTML page, re-cached each time a comment is added/edited/deleted?

    Finally: does anyone have any tips/pointers for running a high-load site on PHP? I'm pretty sure it's a workable language to use, Facebook and Yahoo! give it great precedence, but are there any experiences I should watch out for?

    Thanks, Ross
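    On the caching question, the second option is the classic pattern: cache the rendered comment HTML and invalidate it on writes. A language-agnostic sketch in Python (the asker's stack is PHP, where the same shape works with e.g. memcached; render_comments is a hypothetical one-DB-hit renderer):

        # Sketch: write-invalidated cache of rendered comment blocks.
        cache = {}

        def comments_html(article_id):
            key = 'comments:%d' % article_id
            if key not in cache:
                cache[key] = render_comments(article_id)  # single DB hit
            return cache[key]

        def on_comment_changed(article_id):
            # Drop the entry; the next read re-renders from the DB.
            cache.pop('comments:%d' % article_id, None)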

    Read the article

  • How do web crawlers affect site statistics?

    - by LM
    What are ways in which web crawlers (both from search engines and elsewhere) could affect site statistics (e.g., when A/B-testing different page variations)? And what are ways to take care of these problems? For example: do people writing web crawlers often delete their cookies and mask their IPs, so that web crawlers show up as different users each time they crawl the site? What heuristics can be used to recognize that something is a bot? (I'm guessing any sufficiently sophisticated bot can be indistinguishable from a real user if it wants to; is this correct?)
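    As a starting point, most well-behaved crawlers identify themselves, so a first-pass filter on the user-agent string catches a lot before the harder heuristics (fetching robots.txt, ignoring JavaScript and cookies, inhuman request rates) are needed. A hedged sketch; the signature list is illustrative, not exhaustive:

        # Sketch: cheap first-pass bot filter for analytics / A/B-test logs.
        BOT_SIGNATURES = ('googlebot', 'bingbot', 'slurp', 'spider', 'crawler', 'bot')

        def looks_like_bot(user_agent):
            ua = (user_agent or '').lower()
            return any(sig in ua for sig in BOT_SIGNATURES)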

    Read the article

  • SEO chaos from changing robots.txt file in Wordpress site

    - by Seedorf
    Hi there, I recently edited the robots.txt file on my site using a WordPress plugin. However, since I did this, Google seems to have removed my site from its search results. I'd appreciate an expert opinion on why this is so, and a possible solution. I'd initially done it to increase my search ranking by limiting the pages being accessed by Google. This is my robots.txt file in WordPress:

        User-agent: *
        Disallow: /cgi-bin
        Disallow: /wp-admin
        Disallow: /wp-includes
        Disallow: /wp-content/plugins
        Disallow: /wp-content/cache
        Disallow: /trackback
        Disallow: /feed
        Disallow: /comments
        Disallow: /category/*/*
        Disallow: */trackback
        Disallow: */feed
        Disallow: */comments
        Disallow: /*?*
        Disallow: /*?
        Allow: /wp-content/uploads
        Sitemap: http://www.instant-wine-cellar.co.uk/wp-content/themes/Wineconcepts/Sitemap.xml
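    One way to sanity-check a file like this before deploying it is Python's standard urllib.robotparser. A sketch under a caveat: the stdlib parser evaluates only plain prefix rules, not the wildcard extensions used in lines like "Disallow: /*?*", which is itself a reminder that crawlers interpret those wildcard lines differently among themselves.

        # Sketch: test which URLs a set of plain prefix rules blocks.
        import urllib.robotparser

        rules = [
            'User-agent: *',
            'Disallow: /wp-admin',
            'Disallow: /feed',
        ]
        rp = urllib.robotparser.RobotFileParser()
        rp.parse(rules)
        print(rp.can_fetch('Googlebot', '/wp-admin/options.php'))  # False
        print(rp.can_fetch('Googlebot', '/my-post/'))              # True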

    Read the article

  • AJAX is reloading page on a SharePoint site (SharePoint 2007 AJAX-enabled)

    - by Josh
    I have an AJAX-enabled SharePoint 2007 site. I have also created a user control that has an interactive AJAX form. It obviously works like a charm locally, but I am trying to get it working on the SharePoint site. The problem is that once I load the user control onto an aspx page inside SharePoint, the form (which uses AJAX) causes the page to reload every time a postback occurs. Can someone help point me in the direction of debugging this? I really need to eliminate the page refreshes and have the AJAX work correctly in SharePoint. I read that the ScriptManager has to be in the SharePoint master page, but that did not work either; the page still reloads every time. Thanks.

    Read the article

  • Web site aggregation with twitter widget SSL issue

    - by AB
    Hello! I'm looking for a solution for isolating a widget, included via a partial, from the main site. The issue appears when users access the site over https: IE 6 and 7 show a security confirmation dialog because part of the website's resources are not in the secure zone. First of all, I downloaded the Twitter widget to our side, along with all its CSS and images. Then I patched the widget JS to point at the downloaded resources. But I still have no luck with the security warning :( I guess the reason for this issue is the widget's AJAX request to Twitter, but I have no idea how to solve it, other than creating some kind of proxy on our side. Thank you for your attention.
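    The proxy idea at the end is the usual fix for this kind of mixed-content warning: relay the widget's data request through your own HTTPS origin so every resource on the page is secure. A bare-bones sketch, assuming the Flask and requests packages; the route and upstream path are illustrative:

        # Sketch: same-origin HTTPS relay for the widget's data request.
        import requests
        from flask import Flask, Response

        app = Flask(__name__)

        @app.route('/twitter-proxy/<path:path>')
        def twitter_proxy(path):
            upstream = requests.get('https://api.twitter.com/' + path, timeout=5)
            return Response(upstream.content,
                            status=upstream.status_code,
                            content_type=upstream.headers.get('Content-Type'))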

    Read the article

  • Prevent IE users from visiting my site?

    - by Paul Hatcherian
    Internet Explorer has caused me a lot of trouble over the years: between security problems, memory leaks, endless CSS and JavaScript hacks to get my site to look correct, and inconsistencies between releases, I've spent countless hours as the hapless victim of IE's idiosyncrasies. Well, that ends today. I've decided to take matters into my own hands and ban all users of IE from visiting my website. That will teach them to use such a cruddy browser. My question is how best to do this? I don't want to rely on JavaScript, which could be disabled, nor the user-agent string, which could be tampered with. A clever user could even temporarily switch to Firefox or Chrome just to visit my site. Ideally, I'd have a list of the IP addresses of every IE user in the world and restrict based on IP address. The main problem I'm having, aside from getting the list in the first place, is how do I keep it updated? Thanks!
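    There is, of course, no registry of IE users' IP addresses, so in practice the only lever is the (spoofable) user-agent header the poster wants to avoid. A hedged sketch of that realistic approach, written as generic WSGI middleware:

        # Sketch: refuse requests whose User-Agent identifies IE.
        def block_ie(app):
            def middleware(environ, start_response):
                ua = environ.get('HTTP_USER_AGENT', '')
                if 'MSIE' in ua or 'Trident' in ua:
                    start_response('403 Forbidden', [('Content-Type', 'text/plain')])
                    return [b'This site does not support Internet Explorer.']
                return app(environ, start_response)
            return middleware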

    Read the article

  • SharePoint web part not displaying for site readers

    - by gregenslow
    Hello - I've built a custom SharePoint 2010 web part and deployed it to the home page of a publishing site. It's a very simple web part that just displays items from a SP list in a drop down list. The web part works fine if I'm logged in as a site owner or a member but not if I'm just a reader. The web part doesn't render at all to readers. I don't get any of the web part chrome or title, just nothing. I have other web parts (out-of-box ones) in the same zone that are displaying fine so it's not an issue of the whole zone not displaying. As a reader, I can still view the list directly so it doesn't appear to be a problem with list permissions. My web parts are being deployed as a farm solution, not sand-boxed and the assembly is being deployed to the GAC. I feel like I must be missing something simple here but I'm stumped. Help.

    Read the article

  • Embedding googleVis charts into a web site

    - by gd047
    Reading from the googleVis package vignette: "With the googleVis package users can create easily web pages with interactive charts based on R data frames and display them either via the R.rsp package or within their own sites". Following the instructions, I was able to see the sample charts using the plot method for gvis objects. This method by default creates an rsp file in the rsp/myAnalysis folder of the googleVis package, using the type and chart id information of the object, and displays the output using the local web server of the R.rsp package (port 8074 by default). Could anybody help me (or provide a link) with the procedure one has to follow in order to embed such charts into an existing web site (e.g., a Joomla site)?

    Read the article

  • Office documents prompt for login in anonymous SharePoint site

    - by xmt15
    I have a MOSS 07 site that is configured for anonymous access. There is a document library within this site that also has anonymous access enabled. When an anonymous user clicks on a PDF file in this library, he or she can read or download it with no problem. When a user clicks on an Office document, he or she is prompted with a login box. The user can cancel out of this box without entering a login and will be taken to the document. This happens in IE but not Firefox. I see some references to this question on the web but no clear solutions:

    http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?dg=microsoft.public.sharepoint.windowsservices.development&tid=5452e093-a0d7-45c5-8ed0-96551e854cec&cat=en_US_CC8402B4-DC5E-652D-7DB2-0119AFB7C906&lang=en&cr=US&sloc=&p=1
    http://www.sharepointu.com/forums/t/5779.aspx
    http://www.eggheadcafe.com/software/aspnet/30817418/anonymous-users-getting-p.aspx

    Read the article

  • Restaurant review site in SharePoint

    - by Sisiutl
    I'm totally new to SharePoint but my company has adopted it for internal use. As a learning exercise I thought it would be fun to create a local restaurant review site. Can anyone point me to an example or tutorial to get me started? What I have in mind is a simple site where users can enter the restaurant name, cuisine style, address, a simple 5 point rating and price scale, and a short review. Other users could search for restaurants by cuisine and add their own reviews. We have SharePoint 2007 and MOSS. Thanks

    Read the article

  • Rails redirect_to jQTouch site does not work as expected

    - by tilthouse
    I have a Rails app with a jQTouch mobile site that is displayed if the user goes to m.blah.com. First I detect the browser, then redirect_to m.blah.com if it's an iPhone, etc. All well and good. When I use desktop Safari, this all works exactly right. However, when I use an actual iPhone or the Apple iPhone Simulator, it does not. The mobile site appears to load without the browser actually doing the redirect: the URL in the browser is still www. I am wondering if this behavior is due to Mobile Safari, or if it is somehow jQTouch trying to load the page with AJAX rather than a reload (which would be odd, as jQTouch hasn't been loaded at all before the redirect). Any ideas?

    Read the article

  • Difference between 'Web Site' and 'Project' in Visual Studio

    - by Gudmundur Orn
    Duplicate: http://stackoverflow.com/questions/344473/asp-net-website-or-web-application-project

    I have noticed that there is clearly a difference in what you get when you fire up Visual Studio 2008 and choose 'New Project' - 'ASP.NET Web Application' instead of 'New Web Site' - 'ASP.NET Web Site'. For example, if you choose 'Project', then you can compile to a .dll and each page gets a *.aspx.designer.cs codebehind file.

    1) Why do we have these two different project types?
    2) Which do you prefer?
    3) Why would I choose one over the other?
    4) What's the deal with the *.aspx.designer.cs files?

    Read the article

  • http/html/ajax: show result before site is completely processed

    - by chris
    Hi! I am looking for a way for a webapp to show a wait message before a long task and hide it afterwards; the task runs for quite a while. I don't know if this is possible at all. The problem, as far as I can see, is that the page is returned to the user's browser only AFTER the task has completed, because the task is part of the page, run as inline code by the web server's interpreter (no matter whether PHP, Perl, or whatever). The only solution I can imagine is to run the task in a parallel thread or process and query its state with AJAX from the website. Is there a less complex way to do it? Thanks for the help! Regards, chris
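    The parallel-task idea at the end is the standard pattern: kick the work off in the background, return the page (with the wait message) immediately, and let the browser poll a status URL via AJAX until the job finishes. A framework-agnostic sketch in Python (the poster's stack may be PHP or Perl, where a queued worker plays the same role):

        # Sketch: background job table that an AJAX status endpoint can poll.
        import threading
        import uuid

        jobs = {}  # job id -> 'running' or 'done'

        def start_job(task, *args):
            job_id = str(uuid.uuid4())
            jobs[job_id] = 'running'

            def run():
                task(*args)
                jobs[job_id] = 'done'

            threading.Thread(target=run).start()
            return job_id  # embed in the page; the AJAX poll reads jobs[job_id]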

    Read the article

  • Sharepoint Site Administration

    - by Shrewd Demon
    Hey, I've got a SharePoint website running on my machine (it shows up inside an application pool in the IIS manager). Now, this website has different user credentials specified under the Identity section of the pool's properties, and when I view w3wp.exe in the Task Manager it confirms that the site is running as that other user. The problem is that if I change the pool's username and password to mine, the site stops working. How do I run it under my account credentials? Please help. Thanks

    Read the article

  • MOSS 2007 team site page title

    - by nav
    Hi, I'm trying to display the page title (HTML title) on the default.aspx page of a custom site template. The template is based on a MOSS team site template. All that displays as the page title is the URL of the page. Can I change the code in default.aspx and/or the site's master page to define the title myself? Details of the default.aspx and default.master pages are below. Thanks.

    default.aspx:

        <asp:Content ContentPlaceHolderId="PlaceHolderPageTitle" runat="server">
            <SharePoint:EncodedLiteral runat="server" text="<%$Resources:wss,multipages_homelink_text%>" EncodeMethod="HtmlEncode"/> -
            <SharePoint:ProjectProperty Property="Title" runat="server"/>
        </asp:Content>

    default.master:

        <Title ID=onetidTitle><asp:ContentPlaceHolder id=PlaceHolderPageTitle runat="server"/></Title>

    Read the article

  • Zend Framework ErrorController not working on live site?

    - by firecall
    My site is set up pretty much as per the Zend Framework quickstart guide. The default ErrorController works locally using MAMP, but now that I've deployed to my live site, I get a blank page and a "500 Internal Server Error" (according to Firebug) when I go to an action that doesn't exist. On my local server I get a 404 and a nicely formatted error page. Any ideas, anyone? I don't really know where to begin looking. I'm confused :/ Thanks.

    Read the article
