Search Results

Search found 108959 results on 4359 pages for 'ado net data services'.


  • asp.net mvc asp.net

    - by mazhar
    I want to develop a wizard mechanism similar to the design in the picture above. How would I do that? Please suggest how I would code such a mechanism; at the moment I am completely blank, so any idea would be great.

    Read the article

  • .NET: understanding web.config in ASP.NET

    - by mark smith
    Hi there, does anyone know of a good link explaining how to use web.config? For example, I am using forms authentication, and I notice there is a system.web element that is opened and closed, and then further down, still inside configuration, there are additional location tags. Here is an example: there is an authentication mode="Forms" element with authorization, which I presume applies to the root, and it is self-contained within its own system.web element. Below this there are more location elements, each with their own system.web tags. I have never really understood what I am actually doing here. I have tried checking the MSDN documentation but I still don't fully understand it. Can anyone help? If you notice, in my example everything is stored in one web.config; I thought the standard approach was to create a normal web.config and then create another web.config in the directory I wish to protect?

        <configuration>
          <system.web>
            <compilation debug="true" strict="false" explicit="true" targetFramework="4.0" />
            <authentication mode="Forms">
              <forms loginUrl="Login.aspx" defaultUrl="Login.aspx" cookieless="UseCookies" timeout="60"/>
            </authentication>
            <authorization>
              <allow users="*"/>
            </authorization>
          </system.web>
          <location path="Forms">
            <system.web>
              <authorization>
                <deny users="?"/>
                <allow users="*"/>
              </authorization>
            </system.web>
          </location>
          <location path="Forms/Seguridad">
            <system.web>
              <authorization>
                <allow roles="Administrador"/>
                <deny users="?"/>
              </authorization>
            </system.web>
          </location>

    Read the article

  • Recommendations for colocation in the US

    - by Emil
    Hello ServerFault. I work for a European media company and we are currently looking for colocation in the US. I know the European market quite well; unfortunately that is not the case for the US. I'm hoping you can help me out with a few questions, it would be much appreciated! I am looking for a data center that can deliver a high level of availability (Tier 3 or better). The installation will be fairly large, so capacity is important, as is good internet connectivity/carrier presence. Most important of all, though, is good customer support: skilled, dedicated and responsive technical staff, since we won't have tech staff close by. I'm looking for a small and fast-moving company that targets internet businesses rather than big old enterprise hosting. What locations should we go for, given that we want to reach all of the US from a single site and still maintain decent latency? (Do we need east and west coast?) Where are the main internet hubs, and should you try to get as close to them as possible? Are there any good online resources I should look at? Where do the large-scale internet/media services colocate? Lastly, I would be very happy to get some actual recommendations for companies to talk to. P.S. I'm happy to return the favor if anyone has questions regarding data centers and colocation in Europe.

    Read the article

  • ASP.NET MVC.NET JQueryUI datepicker inside a div loaded/updated with ajax.actionlink

    - by ArjanW
    I'm trying to incorporate jQuery UI's datepicker inside a partial view like this:

        <% using (Ajax.BeginForm("/EditData", new AjaxOptions { HttpMethod = "POST", UpdateTargetId = "div1" })) { %>
          Date: <%= Html.TextBox("date", String.Format("{0:g}", Model.date), new { id = "datePicker" }) %>
        <% } %>
        <script type="text/javascript">
          $(function() { $("#datePicker").datepicker(); });
        </script>

    When I call the URL of this partial view directly, so that only this view is rendered, the datepicker works perfectly. (For the purpose of testing this I added the required jQuery and jQuery UI script and CSS references directly to the partial view.) But if I use an Ajax.ActionLink to load this partial view inside a div (called div2; submitting the form above should update div1) like this:

        <div id="div1">
          <%= Ajax.ActionLink("Edit", "/EditData", new { id = Model.id }, new AjaxOptions { HttpMethod = "GET", UpdateTargetId = "div2" }) %>
        </div>
        <div id="div2">placeholder for the form</div>

    the datepicker won't appear anymore. My best guess is that the JavaScript included in the loaded HTML doesn't get executed ($(document).ready(function() { $("#datepicker").datepicker(); }); doesn't work either). If that's the case, how and where should I call $("#datepicker").datepicker()? Putting it in the AjaxOptions of the Ajax.ActionLink as OnComplete = "$(function() { $('#datepicker').datepicker(); });" still doesn't work. If that's not the case, then where is my problem?

    Read the article

  • VB.NET: after load event?

    - by themaninthesuitcase
    I need some way of knowing when a form has finished loading. My reasoning: I have a second form that is loaded when this form loads, and the code for this is called from Form1's Load handler. Form2 is currently being displayed behind Form1, as I am guessing Form1 calls Activate or similar at the end of the load, so any Activate, BringToFront, etc. calls on Form2 are overridden. If you look at the code below, I have tried adding frmAllocationSearch.Activate, frmAllocationSearch.BringToFront and Me.SendToBack after the call to ShowAlloactionSearchDialog(), but these are all wasted because something happens after the Load event is fired that brings Me to the front. The code is:

        Private Sub Allocation_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
            ShowAlloactionSearchDialog()
        End Sub

        Private Sub ShowAlloactionSearchDialog()
            If frmAllocationSearch Is Nothing OrElse frmAllocationSearch.IsDisposed Then
                frmAllocationSearch = New AllocationSearch
                frmAllocationSearch.MdiParent = Me.MdiParent
                frmAllocationSearch.Info = Me.Info
                frmAllocationSearch.Top = Me.Top
                frmAllocationSearch.Left = Me.Left + Me.Width - frmAllocationSearch.Width
                frmAllocationSearch.AllocationWindow = Me
                frmAllocationSearch.Show()
            Else
                If frmAllocationSearch.WindowState = FormWindowState.Minimized Then frmAllocationSearch.WindowState = FormWindowState.Normal
                frmAllocationSearch.Activate()
            End If
        End Sub

    Read the article

  • Why does deploying a .NET Compact Framework assembly cause .NET Desktop Framework assemblies to be deployed?

    - by Matthew Belk
    I am trying to get one of my developers set up to work on a fairly large .NETCF project. When we try to simply deploy the solution and all of its projects to a target device, deploying one of the projects triggers several assemblies from the desktop framework to be copied from the GAC to the device. What on earth could cause this? The assemblies from the "big" framework are ones like System.DirectoryServices, System.Design, and a bunch of others.

    Read the article

  • Silverlight (RIA Services) spontaneous culture changing

    - by Marc Wittke
    My RIA-enabled Silverlight application sets the thread culture in the App constructor (this is absolutely okay since it is an intranet application and will never ever be used by someone who is not German):

        public App()
        {
            InitializeComponent();
            Thread.CurrentThread.CurrentCulture = new CultureInfo("de-DE");
        }

    It does what it should: the DataForms display DateTime values in German notation. BUT: the notation spontaneously changes to en-US when navigating between items in the data source that is bound to the DataForm. Why?

    Read the article

  • Label, ASP.NET, C#

    - by viahal
    I have a label and a text box associated with it. I have added some text to the text box, which is invisible at first. Now I want to display that content when I hover over the label. Please help me out.

    Read the article

  • WordPress-like dynamic permalinks in ASP.NET MVC 2/3 or ASP.NET 4.0

    - by Aseem Gautam
    Scenario: there are two entities, say 'Books' and 'Book Reviews'. There can be multiple books and each book can have multiple reviews. Each review and each book should have a separate permalink. Books and reviews can be added by users using separate input forms. As soon as any book or review is added it should be accessible by its permalink. Can anyone point me in the right direction on how this should be implemented?
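
    A common way to get WordPress-style permalinks in ASP.NET MVC is a catch-all route that hands the whole slug to a single controller action, which looks the slug up in the database and renders the matching book or review. A minimal sketch, assuming MVC 3; PermalinkRepository, PermalinkEntry and the view names are hypothetical, and route ordering/constraints would need care so your fixed pages and input forms are not swallowed by the catch-all:

        using System.Web;
        using System.Web.Mvc;
        using System.Web.Routing;

        public class MvcApplication : System.Web.HttpApplication
        {
            public static void RegisterRoutes(RouteCollection routes)
            {
                // Catch-all route: any URL not claimed by an earlier route is handed to
                // PermalinkController as one "permalink" string (e.g. "books/some-book/review-2").
                // Register it after your fixed routes so admin pages and input forms still win.
                routes.MapRoute(
                    "Permalink",
                    "{*permalink}",
                    new { controller = "Permalink", action = "Resolve" });
            }
        }

        public class PermalinkEntry                 // hypothetical lookup result
        {
            public bool IsReview { get; set; }
            public object Model { get; set; }
        }

        public class PermalinkController : Controller
        {
            public ActionResult Resolve(string permalink)
            {
                // FindBySlug is a hypothetical query against your Books/Reviews tables.
                PermalinkEntry entry = PermalinkRepository.FindBySlug(permalink);
                if (entry == null)
                    throw new HttpException(404, "Permalink not found");

                // Render the review view or the book view, depending on what the slug resolved to.
                return View(entry.IsReview ? "Review" : "Book", entry.Model);
            }
        }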

    Read the article

  • Binding an object to the Request in ASP.NET MVC

    - by Alxandr
    I've created an object that I'd like to have accessible from wherever the request object is accessible, and to "die" with the request, more or less like how you always have access to the RouteData collection in an MVC application. In particular, I need access to this object during the execution of action filters. Also, a new instance of my class needs to be created whenever a new request is made to the page (the object needs to be request-safe, i.e. only one request modifies that one object). Any thoughts about how to achieve this?
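
    One common way to get exactly this lifetime is to keep the object in HttpContext.Items, a per-request dictionary that is created and discarded with each request and is reachable from action filters via filterContext.HttpContext. A minimal sketch, assuming a hypothetical RequestData class:

        using System.Web;
        using System.Web.Mvc;

        public class RequestData                   // hypothetical per-request object
        {
            public string SomeValue { get; set; }
        }

        public static class RequestDataExtensions
        {
            private const string Key = "__RequestData";

            // Lazily creates one RequestData per request and stores it in HttpContext.Items,
            // which lives exactly as long as the request itself.
            public static RequestData GetRequestData(this HttpContextBase context)
            {
                var data = context.Items[Key] as RequestData;
                if (data == null)
                {
                    data = new RequestData();
                    context.Items[Key] = data;
                }
                return data;
            }
        }

        public class SampleFilter : ActionFilterAttribute
        {
            public override void OnActionExecuting(ActionExecutingContext filterContext)
            {
                // The same instance is visible to filters, controllers and views, for this request only.
                var data = filterContext.HttpContext.GetRequestData();
                data.SomeValue = "set from a filter";
            }
        }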

    Read the article

  • Controller not handling request and "page not found" problem in ASP.NET MVC

    - by Richa Media and services
    1. In my ASP.NET MVC project my controller does not handle the request; it says "page not found", even though the page is located in the correct folder. 2. I also have a problem with a second page, which is handled by a controller that used to work correctly, but now there is a problem in the temporary ASP.NET files. I remove them, but they come back when I debug the project again. So: 1. Why does my controller not handle the request? 2. How do I remove the temporary ASP.NET files, given that they come back and give me an error whenever I restart debugging?

    Read the article

  • Load a 6 MB binary file in a SQL Server 2005 VARBINARY(MAX) column using ADO/VC++?

    - by Feroz Khan
    How do I load a 6 MB binary file (.bin) into a varbinary(MAX) column of a SQL Server 2005 database using ADO in a VC++ application? This is the code I am using to load the file, which I previously used to load a .bmp file:

        BOOL CSaveView::PutECGInDB(CString strFilePath, FieldPtr pFileData)
        {
            // Open file
            CFile fileImage;
            CFileStatus fileStatus;
            fileImage.Open(strFilePath, CFile::modeRead);
            fileImage.GetStatus(fileStatus);

            // Allocate memory for data
            ULONG nBytes = (ULONG)fileStatus.m_size;
            HGLOBAL hGlobal = GlobalAlloc(GPTR, nBytes);
            LPVOID lpData = GlobalLock(hGlobal);

            // Read the file into the buffer
            fileImage.Read(lpData, nBytes);

            HRESULT hr;
            _variant_t varChunk;
            long lngOffset = 0;
            UCHAR chData;
            SAFEARRAY FAR *psa = NULL;
            SAFEARRAYBOUND rgsabound[1];

            try
            {
                // Create a safe array to store the bytes
                rgsabound[0].lLbound = 0;
                rgsabound[0].cElements = nBytes;
                psa = SafeArrayCreate(VT_UI1, 1, rgsabound);

                while (lngOffset < (long)nBytes)
                {
                    chData = ((UCHAR*)lpData)[lngOffset];
                    hr = SafeArrayPutElement(psa, &lngOffset, &chData);
                    if (hr != S_OK)
                    {
                        return false;
                    }
                    lngOffset++;
                }
                lngOffset = 0;

                // Assign the safe array to a variant
                varChunk.vt = VT_ARRAY | VT_UI1;
                varChunk.parray = psa;

                hr = pFileData->AppendChunk(varChunk);
                if (hr != S_OK)
                {
                    return false;
                }
            }
            catch (_com_error &e)
            {
                // Get info from _com_error
                _bstr_t bstrSource(e.Source());
                _bstr_t bstrDescription(e.Description());
                _bstr_t bstrErrorMessage(e.ErrorMessage());
                _bstr_t bstrErrorCode(e.Error());

                TRACE("Exception thrown for classes generated by #import");
                TRACE("\tCode = %08lx\n", (LPCSTR)bstrErrorCode);
                TRACE("\tCode Meaning = %s\n", (LPCSTR)bstrErrorMessage);
                TRACE("\tSource = %s\n", (LPCSTR)bstrSource);
                TRACE("\tDescription = %s\n", (LPCSTR)bstrDescription);
            }
            catch (...)
            {
                TRACE("*** Unhandled exception ***");
            }

            // Free memory
            GlobalUnlock(lpData);
            return true;
        }

    But when I read the same file back using the GetChunk function it gives me all zeros, although the size it returns is the same as the size uploaded. Your help will be highly appreciated.

    Read the article

  • Are IIS services closed after some time?

    - by mafutrct
    I'd like to host a WCF web service in IIS. The service should keep a certain set of data all the time, it must never be lost. My colleague told me this is impossible because IIS closes down the service after a certain time (I assume without any activity). Is that true? How do I prevent that behavior? In case it matters, both IIS 6 and 7 are available.
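
    For what it's worth, IIS application pools do shut down idle worker processes and recycle periodically by default (the timeouts are configurable on the application pool in both IIS 6 and 7), so data kept only in the service's memory is never guaranteed to survive. A defensive sketch, with a hypothetical IDurableStore standing in for whatever database or file storage you use, is to treat memory as a cache and rehydrate on startup:

        using System.Collections.Generic;

        public interface IDurableStore                 // hypothetical persistence backend
        {
            IDictionary<string, string> LoadAll();
            void Save(string key, string value);
        }

        public class StatefulService
        {
            private readonly IDurableStore store;
            private readonly IDictionary<string, string> cache;

            public StatefulService(IDurableStore store)
            {
                this.store = store;
                // Rehydrate on construction, so an idle-timeout or app-pool recycle
                // only costs a reload, never the data itself.
                this.cache = store.LoadAll();
            }

            public string Get(string key)
            {
                string value;
                return cache.TryGetValue(key, out value) ? value : null;
            }

            public void Put(string key, string value)
            {
                cache[key] = value;      // fast in-memory copy
                store.Save(key, value);  // write-through so nothing is lost on shutdown
            }
        }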

    Read the article

  • SoapUI: mocking services which return JSON

    - by eeek
    Microsoft Ajax can expose web services which respond with JSON or XML depending on configuration. I would like to mock these services using SoapUI. Using the WSDL I can mock the services in the case where XML is returned; however, how can I mock the response when JSON is returned?

    Read the article

  • Button posts back when clicked in ASP.NET with VB.NET

    - by Lefteris Gkinis
    In my Default.aspx web page I use the following code in order to save some attributes to an XML file:

        <asp:Button ID="Button1" Text="Save Attributes" CssClass="Button01" runat="server"
                    OnClientClick="SaveAttributesButton_Click(); return, false;">
        </asp:Button>

    When I click on the button it starts a postback and runs all the code from the page load onwards. I have already tried the HTML button, but that needs JavaScript which runs when the page loads, and the sub runs without my command. Please, is there anybody who can assist me with this issue?

    Read the article

  • Model View Presenter plus ASP.NET Web Service; where does the asmx live?

    - by Dirk
    I've been slowly transitioning from a traditional Web Forms architecture to the MVP pattern (Passive View). So far it's been fairly easy to implement because the views I've dealt with have all employed a classic postback model. However, I've come across my first view that will refresh portions of itself via web services. I can't grok where the web service should live (the Presenter, I think) or how to expose that asmx endpoint to my view while still maintaining the clean separation of concerns/testability that MVP affords me. I've searched far and wide for examples of how this might be implemented and have come up with nothing. Please help!
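
    One arrangement that keeps the Passive View clean is to treat the asmx as just another thin entry point, like the code-behind: it lives alongside the view, owns no logic, and immediately delegates to the same presenter or application service the presenter uses, so everything testable stays below it. A rough sketch with hypothetical names (IOrderService, OrderWebService, ServiceLocator):

        using System.Web.Services;

        public interface IOrderService              // logic shared by presenter and web service, unit-testable
        {
            decimal GetOrderTotal(int orderId);
        }

        // The asmx lives next to the view (e.g. ~/Orders/OrderService.asmx) but is as dumb
        // as the code-behind: it only forwards calls to the service/presenter layer.
        [WebService(Namespace = "http://example.com/orders")]    // placeholder namespace
        [System.Web.Script.Services.ScriptService]               // allow JSON calls from the page
        public class OrderWebService : WebService
        {
            // ServiceLocator is a hypothetical stand-in for however you resolve dependencies.
            private readonly IOrderService orders = ServiceLocator.Resolve<IOrderService>();

            [WebMethod]
            public decimal GetOrderTotal(int orderId)
            {
                return orders.GetOrderTotal(orderId);
            }
        }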

    Read the article

  • ASP.NET MVC & Castle Windsor: working with HttpContext-dependent services

    - by Igor Brejc
    I have several dependency injection services which depend on things like the HTTP context. Right now I'm configuring them as singletons in the Windsor container in the Application_Start handler, which is obviously a problem for such services. What is the best way to handle this? I'm considering making them transient and then releasing them after each HTTP request. But what is the best way/place to inject the HTTP context into them? The controller factory, or somewhere else?
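
    For what it's worth, Windsor has a per-web-request lifestyle that fits this case: the component is created once per HTTP request and released when the request ends, so it can safely depend on the current HttpContext. A minimal sketch, assuming a Windsor version that supports UsingFactoryMethod and the PerWebRequest lifestyle (which also needs Castle's PerWebRequestLifestyleModule registered as an HTTP module in web.config); IUserContext is a hypothetical example service:

        using System.Web;
        using Castle.MicroKernel.Registration;
        using Castle.Windsor;

        public interface IUserContext                // hypothetical HttpContext-dependent service
        {
            string CurrentUserName { get; }
        }

        public class HttpUserContext : IUserContext
        {
            private readonly HttpContextBase httpContext;

            public HttpUserContext(HttpContextBase httpContext)
            {
                this.httpContext = httpContext;
            }

            public string CurrentUserName
            {
                get { return httpContext.User.Identity.Name; }
            }
        }

        public static class ContainerBootstrapper
        {
            public static IWindsorContainer Build()
            {
                var container = new WindsorContainer();

                // One instance per HTTP request, released automatically when the request ends.
                container.Register(
                    Component.For<HttpContextBase>()
                             .UsingFactoryMethod(() => new HttpContextWrapper(HttpContext.Current))
                             .LifeStyle.PerWebRequest,
                    Component.For<IUserContext>()
                             .ImplementedBy<HttpUserContext>()
                             .LifeStyle.PerWebRequest);

                return container;
            }
        }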

    Read the article

  • ASP.NET ~ remotely invoking an ASP.NET page

    - by eponymous23
    I have two servers running IIS, say, Server-A and Server-B. Server-A is in the DMZ, visible to all users; Server-B is not in the DMZ. I need to provide a way for users to invoke a page on Server-A which will in turn remotely request a page on Server-B, transparently to the user. In other words, Server-A needs to do this on behalf of the user because the user does not have visibility to Server-B. Is this possible and if so, what is the best method to do this?
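
    This is possible; Server-A effectively acts as a reverse proxy for that one page. A minimal sketch using a generic handler and HttpWebRequest (the internal URL is a placeholder, and a real version would also need to forward headers, cookies and POST bodies as appropriate):

        using System.IO;
        using System.Net;
        using System.Web;

        // Registered on Server-A, e.g. as ~/Relay.ashx; forwards the request to Server-B
        // and streams the response back to the caller, who never sees Server-B.
        public class RelayHandler : IHttpHandler
        {
            private const string InternalUrl = "http://server-b.internal/SomePage.aspx"; // hypothetical

            public void ProcessRequest(HttpContext context)
            {
                var request = (HttpWebRequest)WebRequest.Create(
                    InternalUrl + "?" + context.Request.QueryString);
                request.Method = "GET";

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var stream = response.GetResponseStream())
                {
                    context.Response.ContentType = response.ContentType;
                    stream.CopyTo(context.Response.OutputStream);   // Stream.CopyTo requires .NET 4.0
                }
            }

            public bool IsReusable { get { return true; } }
        }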

    Read the article

  • I need to create a web service using .NET-- where should I start?

    - by j-t-s
    Hi all, I am trying to implement a feature, but I've never had anything to do with "web services" before, other than using them. I have a desktop application, and I want that application to be able to "post" some information (e.g. email address, username, user-selected options (just plain text), etc.) to an application or "web service" on my ASP.NET web server. Can somebody please guide me in the right direction? How would I accomplish this? Thank you :) Jason
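
    A common starting point on an ASP.NET server is an asmx web service: add one file to the site, mark a method with [WebMethod], and the desktop app can call it either through a generated proxy (Add Web Reference) or a plain HTTP POST. A minimal sketch; the Signup.asmx name, namespace URI and field names are made up for the example:

        using System.Collections.Specialized;
        using System.Net;
        using System.Text;
        using System.Web.Services;

        // Server side: code-behind of a hypothetical ~/Signup.asmx in the ASP.NET site.
        [WebService(Namespace = "http://example.com/signup")]
        public class Signup : WebService
        {
            [WebMethod]
            public string RegisterUser(string email, string userName, string options)
            {
                // Store the values somewhere durable (database, file, ...) and acknowledge.
                return "Registered " + userName;
            }
        }

        // Desktop side: call the service with a plain HTTP POST (no generated proxy).
        // Alternatively, use "Add Web Reference" in the desktop project for a typed proxy.
        public static class SignupClient
        {
            public static string Register(string email, string userName, string options)
            {
                using (var client = new WebClient())
                {
                    var fields = new NameValueCollection
                    {
                        { "email", email },
                        { "userName", userName },
                        { "options", options }
                    };
                    // HTTP POST to asmx methods must be enabled on the server for remote callers.
                    byte[] response = client.UploadValues(
                        "http://www.example.com/Signup.asmx/RegisterUser", fields);
                    return Encoding.UTF8.GetString(response);   // XML-wrapped return value
                }
            }
        }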

    Read the article

  • Consuming REST based web services in .Net

    - by steve
    Greetings, I'm confused as to the best approach to take when consuming REST-based web services with .NET. At the moment I'm using the System.Net.WebClient class. Should I be using the WebResponse and WebRequest classes in System.Net instead? If I were to use another approach (other than WebClient), what disadvantages/advantages would I gain? Thanks,
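
    Roughly speaking, WebClient is a convenience wrapper over HttpWebRequest/WebResponse: it is fine for simple downloads, while dropping down to HttpWebRequest gives you control over the HTTP verb, headers, timeouts and error handling that REST APIs often need. A small sketch of the lower-level approach; the URL, Accept header and timeout are placeholder choices:

        using System.IO;
        using System.Net;

        public static class RestSample
        {
            public static string GetJson(string url)
            {
                var request = (HttpWebRequest)WebRequest.Create(url);
                request.Method = "GET";                   // or PUT/POST/DELETE for other REST verbs
                request.Accept = "application/json";      // explicit content negotiation
                request.Timeout = 10000;                  // fine-grained timeout control

                try
                {
                    using (var response = (HttpWebResponse)request.GetResponse())
                    using (var reader = new StreamReader(response.GetResponseStream()))
                    {
                        return reader.ReadToEnd();
                    }
                }
                catch (WebException ex)
                {
                    // Non-2xx status codes surface here; inspect ex.Response for details.
                    var failed = ex.Response as HttpWebResponse;
                    throw new IOException(
                        "REST call failed" + (failed != null ? ": " + failed.StatusCode : ""), ex);
                }
            }
        }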

    Read the article

  • Using Node.js as an accelerator for WCF REST services

    - by Elton Stoneman
    Node.js is a server-side JavaScript platform "for easily building fast, scalable network applications". It's built on Google's V8 JavaScript engine and uses an (almost) entirely async event-driven processing model, running in a single thread. If you're new to Node and your reaction is "why would I want to run JavaScript on the server side?", this is the headline answer: in 150 lines of JavaScript you can build a Node.js app which works as an accelerator for WCF REST services*. It can double your messages-per-second throughput, halve your CPU workload and use one-fifth of the memory footprint, compared to the WCF services direct.

    Well, it can if: 1) your WCF services are first-class HTTP citizens, honouring client cache ETag headers in request and response; 2) your services do a reasonable amount of work to build a response; 3) your data is read more often than it's written.

    In one of my projects I have a set of REST services in WCF which deal with data that only gets updated weekly, but which can be read hundreds of times an hour. The services issue ETags and will return a 304 if the client sends a request with the current ETag, which means in the most common scenario the client uses its local cached copy. But when the weekly update happens, then all the client caches are invalidated and they all need the same new data. Then the service will get hundreds of requests with old ETags, and they go through the full service stack to build the same response for each, taking up threads and processing time. Part of that processing means going off to a database on a separate cloud, which introduces more latency and downtime potential.

    We can use ASP.NET output caching with WCF to solve the repeated processing problem, but the server will still be thread-bound on incoming requests, and to get the current ETags reliably needs a database call per request. The accelerator solves that by running as a proxy - all client calls come into the proxy, and the proxy routes calls to the underlying REST service. We could use Node as a straight passthrough proxy and expect some benefit, as the server would be less thread-bound, but we would still have one WCF and one database call per proxy call. But add some smart caching logic to the proxy, and share ETags between Node and WCF (so the proxy doesn't even need to call the service to get the current ETag), and the underlying service will only be invoked when data has changed, and then only once - all subsequent client requests will be served from the proxy cache.

    I've built this as a sample up on GitHub: NodeWcfAccelerator on sixeyed.codegallery. Here's how the architecture looks:

    The code is very simple. The Node proxy runs on port 8010 and all client requests target the proxy. If the client request has an ETag header then the proxy looks up the ETag in the tag cache to see if it is current - the sample uses memcached to share ETags between .NET and Node. If the ETag from the client matches the current server tag, the proxy sends a 304 response with an empty body to the client, telling it to use its own cached version of the data. If the ETag from the client is stale, the proxy looks for a local cached version of the response, checking for a file named after the current ETag. If that file exists, its contents are returned to the client as the body in a 200 response, which includes the current ETag in the header.
    If the proxy does not have a local cached file for the service response, it calls the service, and writes the WCF response to the local cache file, and to the body of a 200 response for the client. So the WCF service is only troubled if both client and proxy have stale (or no) caches.

    The only (vaguely) clever bit in the sample is using the ETag cache, so the proxy can serve cached requests without any communication with the underlying service, which it does completely generically, so the proxy has no notion of what it is serving or what the services it proxies are doing. The relative path from the URL is used as the lookup key, so there's no shared key-generation logic between .NET and Node, and when WCF stores a tag it also stores the "read" URL against the ETag so it can be used for a reverse lookup, e.g.:

        Key                                                  Value
        /WcfSampleService/PersonService.svc/rest/fetch/3     "28cd4796-76b8-451b-adfd-75cb50a50fa6"
        "28cd4796-76b8-451b-adfd-75cb50a50fa6"               /WcfSampleService/PersonService.svc/rest/fetch/3

    In Node we read the cache using the incoming URL path as the key and we know that "28cd4796-76b8-451b-adfd-75cb50a50fa6" is the current ETag; we look for a local cached response in /caches/28cd4796-76b8-451b-adfd-75cb50a50fa6.body (and the corresponding .header file which contains the original service response headers, so the proxy response is exactly the same as the underlying service). When the data is updated, we need to invalidate the ETag cache, which is why we need the reverse lookup in the cache. In the WCF update service, we don't need to know the URL of the related read service - we fetch the entity from the database, do a reverse lookup on the tag cache using the old ETag to get the read URL, update the new ETag against the URL, store the new reverse lookup and delete the old one.

    Running Apache Bench against the two endpoints gives the headline performance comparison. Making 1000 requests with concurrency of 100, and not sending any ETag headers in the requests, with the Node proxy I get 102 requests handled per second, average response time of 975 milliseconds, with 90% of responses served within 850 milliseconds; going direct to WCF with the same parameters, I get 53 requests handled per second, mean response time of 1853 milliseconds, with 90% of responses served within 3260 milliseconds. Informally monitoring server usage during the tests, Node maxed at 20% CPU and 20Mb memory; IIS maxed at 60% CPU and 100Mb memory.

    Note that the sample WCF service does a database read and sleeps for 250 milliseconds to simulate a moderate processing load, so this is *not* a baseline Node-vs-WCF comparison, but for similar scenarios where the service call is expensive but applicable to numerous clients for a long timespan, the performance boost from the accelerator is considerable.

    * - actually, the accelerator will work nicely for any HTTP request where the URL (path + querystring) uniquely identifies a resource. In the sample, there is an assumption that the ETag is a GUID wrapped in double-quotes (e.g. "28cd4796-76b8-451b-adfd-75cb50a50fa6"), which is the default for WCF services. I use that assumption to name the cache files uniquely, but it is a trivial change to adapt to other ETag formats.
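
    As a rough illustration of the WCF side of that handshake (this is not the actual NodeWcfAccelerator code; ITagCache and the entity names are hypothetical stand-ins for the memcached client), an update operation can re-seed the shared tag cache like this:

        public interface ITagCache                     // hypothetical wrapper around memcached
        {
            string Get(string key);
            void Set(string key, string value);
            void Remove(string key);
        }

        public class Person { public int Id { get; set; } }   // placeholder entity

        public class PersonUpdateService
        {
            private readonly ITagCache tagCache;

            public PersonUpdateService(ITagCache tagCache)
            {
                this.tagCache = tagCache;
            }

            public void Update(Person person, string oldETag)
            {
                // Persist the change first (database call omitted from the sketch).

                // Reverse lookup: the old ETag maps back to the "read" URL for this resource.
                string readUrl = tagCache.Get(oldETag);

                // New ETag in the WCF default format: a GUID wrapped in double-quotes.
                string newETag = "\"" + System.Guid.NewGuid().ToString() + "\"";

                // Re-seed both directions of the lookup and drop the stale reverse entry,
                // so the Node proxy immediately sees that its cached response is out of date.
                tagCache.Set(readUrl, newETag);
                tagCache.Set(newETag, readUrl);
                tagCache.Remove(oldETag);
            }
        }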

    Read the article

  • ASP.NET MVC 3 Hosting :: Error Handling and CustomErrors in ASP.NET MVC 3 Framework

    - by C. Miller
    So, what else is new in MVC 3? MVC 3 now has a GlobalFilterCollection that is automatically populated with a HandleErrorAttribute. This default FilterAttribute brings with it a new way of handling errors in your web applications. In short, you can now handle errors inside of the MVC pipeline. What does that mean? This gives you direct programmatic control over handling your 500 errors, in the same way that ASP.NET and CustomErrors give you configurable control over handling your HTTP error codes. How does that work out? Think of it as a routing table specifically for your exceptions; it's pretty sweet!

    Global Filters

    The new Global.asax file now has a RegisterGlobalFilters method that is used to add filters to the new GlobalFilterCollection, statically located at System.Web.Mvc.GlobalFilter.Filters. By default this method adds one filter, the HandleErrorAttribute.

        public class MvcApplication : System.Web.HttpApplication
        {
            public static void RegisterGlobalFilters(GlobalFilterCollection filters)
            {
                filters.Add(new HandleErrorAttribute());
            }
        }

    HandleErrorAttributes

    The HandleErrorAttribute is pretty simple in concept: MVC has already adjusted us to using filter attributes for our AcceptVerbs and RequiresAuthorization, now we are going to use them for (as the name implies) error handling, and we are going to do so on a (also as the name implies) global scale. The HandleErrorAttribute has properties for ExceptionType, View, and Master. The ExceptionType allows you to specify which exception that attribute should handle. The View allows you to specify which error view (page) you want it to redirect to. Last but not least, the Master allows you to control which master page (or, as Razor refers to them, Layout) you want to render with, even if that means overriding the default layout specified in the view itself.

        public class MvcApplication : System.Web.HttpApplication
        {
            public static void RegisterGlobalFilters(GlobalFilterCollection filters)
            {
                filters.Add(new HandleErrorAttribute
                {
                    ExceptionType = typeof(DbException),
                    // DbError.cshtml is a view in the Shared folder.
                    View = "DbError",
                    Order = 2
                });
                filters.Add(new HandleErrorAttribute());
            }
        }

    Error Views

    All of your views still work like they did in the previous version of MVC (except of course that they can now use the Razor engine). However, a view that is used to render an error can not have a specified model! This is because they already have a model, and that is System.Web.Mvc.HandleErrorInfo:

        @model System.Web.Mvc.HandleErrorInfo

        @{
            ViewBag.Title = "DbError";
        }

        <h2>A Database Error Has Occurred</h2>

        @if (Model != null)
        {
            <p>@Model.Exception.GetType().Name<br />
            thrown in @Model.ControllerName @Model.ActionName</p>
        }

    Errors Outside of the MVC Pipeline

    The HandleErrorAttribute will only handle errors that happen inside of the MVC pipeline, better known as 500 errors. Errors outside of the MVC pipeline are still handled the way they have always been with ASP.NET. You turn on custom errors, specify error codes and paths to error pages, etc. It is important to remember that these will happen for anything and everything outside of what the HandleErrorAttribute handles. Also, these will happen whenever an error is not handled with the HandleErrorAttribute from inside of the pipeline.

        <system.web>
          <customErrors mode="On" defaultRedirect="~/error">
            <error statusCode="404" redirect="~/error/notfound"></error>
          </customErrors>
        </system.web>

    Sample Controllers

        public class ExampleController : Controller
        {
            public ActionResult Exception()
            {
                throw new ArgumentNullException();
            }

            public ActionResult Db()
            {
                // Inherits from DbException
                throw new MyDbException();
            }
        }

        public class ErrorController : Controller
        {
            public ActionResult Index()
            {
                return View();
            }

            public ActionResult NotFound()
            {
                return View();
            }
        }

    Putting It All Together

    If we have all the code above included in our MVC 3 project, here is how the following scenarios will play out:

    1. A controller action throws an Exception. You will remain on the current page and the global HandleErrorAttributes will render the Error view.
    2. A controller action throws any type of DbException. You will remain on the current page and the global HandleErrorAttributes will render the DbError view.
    3. Go to a non-existent page. You will be redirected to the Error controller's NotFound action by the CustomErrors configuration for HTTP status code 404.

    But don't take my word for it, download the sample project and try it yourself.

    Three Important Lessons Learned

    For the most part this is all pretty straightforward, but there are a few gotchas that you should remember to watch out for:

    1) Error views have models, but they must be of type HandleErrorInfo. It is confusing at first to think that you can't control the M in an MVC page, but it's for a good reason. Errors can come from any action in any controller, and no redirect is taking place, so the view engine is just going to render an error view with the only data it has: the HandleErrorInfo model. Do not try to set the model on your error page or pass in a different object through a controller action; it will just blow up and cause a second exception after your first exception!

    2) When the HandleErrorAttribute renders a page, it does not pass through a controller or an action. The standard web.config CustomErrors literally redirect a failed request to a new page. The HandleErrorAttribute is just rendering a view, so it is not going to pass through a controller action. But that's ok! Remember, a controller's job is to get the model for a view, but an error already has a model ready to give to the view, thus there is no need to pass through a controller. That being said, the normal ASP.NET custom errors still need to route through controllers. So if you want to share an error page between the HandleErrorAttribute and your web.config redirects, you will need to create a controller action and route for it. But then, when you render that error view from your action, you can only use the HandleErrorInfo model or the ViewData dictionary to populate your page.

    3) The HandleErrorAttribute obeys whether CustomErrors are on or off, but does not use their redirects. If you turn CustomErrors off in your web.config, the HandleErrorAttributes will stop handling errors. However, that is the only configuration these two mechanisms share. The HandleErrorAttribute will not use your defaultRedirect property, or any other errors registered with custom errors.

    In Summary

    The HandleErrorAttribute is for displaying 500 errors that were caused by exceptions inside of the MVC pipeline. The custom errors are for redirecting from error pages caused by other HTTP codes.
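
    To make lesson 2 concrete, here is a rough sketch (not code from the original sample project) of sharing one error page between the two mechanisms: the web.config defaultRedirect points at a real controller action, and that action populates the shared view through ViewData only, so the HandleErrorAttribute can render the very same view directly with its HandleErrorInfo model:

        using System.Web.Mvc;

        public class ErrorController : Controller
        {
            // Target of <customErrors defaultRedirect="~/error"> for failures outside the MVC pipeline.
            public ActionResult Index()
            {
                // No custom model: the shared Error view reads ViewData here,
                // or the HandleErrorInfo model when HandleErrorAttribute renders it directly.
                ViewData["Message"] = "Sorry, something went wrong.";
                return View("Error");
            }

            // Target of the statusCode="404" redirect in web.config.
            public ActionResult NotFound()
            {
                Response.StatusCode = 404;   // keep the real status code on the redirected response
                return View();
            }
        }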

    Read the article
