Search Results

Search found 52798 results on 2112 pages for 'sharepoint web services'.

Page 188 of 2112

  • Disabling the "save password" prompt in web browsers

    - by torrtruk
    Hi guys. How do I disable the "save password" prompt in web browsers when a form is submitted? I've seen a few bank sites where this prompt doesn't come up. Are they doing it through JavaScript, or are there any HTTP headers or HTML meta tags available to achieve this? I'm trying to do this for a mobile application. Please share your ideas.
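
    One commonly cited approach is a sketch only, not a guaranteed fix: set autocomplete="off" on the form and its password fields, which many banking sites historically used to suppress the prompt. Note that modern browsers often deliberately ignore this hint for password inputs, and the "#login-form" selector below is hypothetical.

    // Sketch: discourage the browser's save-password prompt by marking the
    // form and its password inputs with autocomplete="off". Many modern
    // browsers ignore this hint for password fields, so treat it as best effort.
    function discourageSavePasswordPrompt(form: HTMLFormElement): void {
      form.setAttribute("autocomplete", "off");
      form
        .querySelectorAll<HTMLInputElement>('input[type="password"]')
        .forEach((input) => {
          input.setAttribute("autocomplete", "off"); // some browsers prefer "new-password"
        });
    }

    // Usage: apply to the login form (id is a placeholder) before submission.
    const loginForm = document.querySelector<HTMLFormElement>("#login-form");
    if (loginForm) {
      discourageSavePasswordPrompt(loginForm);
    }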

    Read the article

  • Web technologies exam query

    - by user288245
    I have a practical exam in web technologies covering HTML, CSS, JavaScript and (possibly) PHP. I'm still learning and just wondered if you guys had any advice. HTML I'm fine with, CSS sometimes takes me a while to get right, and JavaScript and PHP get a bit messy! It's open book, so what would you suggest taking with me?

    Read the article

  • ASP.NET Navigate to section on page (help files)

    - by samcooper11
    I am writing a help page with sections. What I want to do is allow navigation to a particular section (depending on what page they clicked help from) rather than always landing at the top of the page. My web app is written in ASP.NET. Can anyone point me in the right direction on how to set this up? Thanks!
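
    The usual approach, sketched under assumptions here, is to give each help section an id and link to it with a URL fragment (e.g. Help.aspx#printing); the browser jumps to the anchor with no extra code. The snippet below only adds optional smooth scrolling, and the section ids are hypothetical.

    // Sketch: on the help page, scroll to the section named in the URL
    // fragment, e.g. Help.aspx#printing scrolls to <div id="printing">.
    window.addEventListener("DOMContentLoaded", () => {
      const sectionId = window.location.hash.slice(1); // drop the leading '#'
      if (!sectionId) return;
      const section = document.getElementById(sectionId);
      section?.scrollIntoView({ behavior: "smooth", block: "start" });
    });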

    Read the article

  • What exactly is an SEO-friendly site?

    - by Tom
    Hey. I've seen web developers writing in their CVs that they create "SEO friendly sites". I've also heard that WordPress and other CMSs are SEO friendly. So, what does an SEO-friendly site mean? I understand that titles and URLs are probably the most important things for getting good positions in Google, but are there any other things I should know? Thanks

    Read the article

  • Simple Java web application on Tomcat

    - by EugeneP
    If we only need to authenticate a user through a graphical login, view a few tables from a database, and change data in the database visually, what tools should we use to write such a web application that will run on Tomcat? What framework allows us to do that in the most straightforward, easy-to-manage and elegant way?

    Read the article

  • Going from web browser to web request

    - by Patrick
    I am working on a program that automates tasks in a browser, like entering text, clicking, etc., and right now everything is working fine when using the WebBrowser control in Visual Studio 2010. What I'd like to know is how I should approach converting all of this so I can send requests directly instead of using the browser. I heard it's a lot more efficient and a lot better if you are going to be using multithreading, but I have so much code that already works now and I am not sure how I should do this without scrapping quite a bit of it.
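
    The general idea, sketched here rather than prescribed, is to capture what the page actually submits (with browser dev tools or Fiddler) and send the equivalent request yourself. The original question is about a .NET app, but the concept is language-agnostic; the URL and field names below are placeholders.

    // Sketch: instead of driving a browser to fill in and submit a form,
    // send the equivalent POST request directly. URL and field names are
    // placeholders for whatever the real page submits.
    async function submitFormDirectly(): Promise<string> {
      const body = new URLSearchParams({
        username: "example-user",
        comment: "text that was previously typed into the browser",
      });
      const response = await fetch("https://example.com/post-form", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body,
      });
      if (!response.ok) throw new Error(`Request failed: ${response.status}`);
      return response.text(); // the raw HTML the browser would have rendered
    }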

    Read the article

  • Utilizing back and forward browser buttons for web apps

    - by at
    How do I use the back and forward browser buttons to navigate my JavaScript/jQuery web application? I know I can do window.location.href = 'this_url#placeholder' and then perform the appropriate action. But when I do that, the browser scrolls to the element with that placeholder id (or to the top if there isn't one). How do I get the browser to not scroll anywhere? I've seen this work on other sites like Facebook; what's the appropriate mechanism for this?
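
    One way to avoid the scroll jump entirely is the History API: history.pushState changes the URL without any scrolling, and a popstate handler restores state when back/forward are pressed. This is a minimal sketch, not the only mechanism, and the state shape and view names are made up.

    // Sketch: use the History API instead of bare location.hash changes, so
    // the URL updates without the browser scrolling to an anchor.
    interface AppState {
      view: string;
    }

    function renderView(view: string): void {
      // Placeholder for whatever the app does to show a view.
      console.log(`Rendering view: ${view}`);
    }

    function navigateTo(view: string): void {
      const state: AppState = { view };
      history.pushState(state, "", `#${view}`); // URL changes, no scroll jump
      renderView(view);
    }

    window.addEventListener("popstate", (event: PopStateEvent) => {
      const state = event.state as AppState | null;
      renderView(state?.view ?? "home"); // back/forward restore the right view
    });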

    Read the article

  • Creating typed WSDLs for generic WCF services of the ESB Toolkit

    - by charlie.mott
    source: http://geekswithblogs.net/charliemott

    Question
    How do you make it easy for client systems to consume the generic WCF services exposed by the ESB Toolkit using messages that conform to agreed schemas\contracts? Usually the developer of a system consuming a web service adds a service reference using a WSDL. However, the WSDLs for the generic services exposed by the ESB Toolkit do not make it easy to develop clients that conform to agreed schemas\contracts.

    Recommendation
    Take a copy of the generic WSDL and modify it to use the proper contracts. This is very easy. It will work with the generic on-ramps so long as the <part> wrapping is removed from the WCF adapter configuration in the BizTalk receive locations. Attempting to create a WSDL where the input and output messages are sent/returned with a <part> wrapper is a nightmare; I have not managed it.

    Consequences
    I can only see the following consequences of removing the <part> wrapper:
    - ESB Test Client: I needed to modify the out-of-the-box ESB Test Client source code to make it send non-wrapped messages.
    - Flat-file formatted messages: the endpoint will no longer support flat-file message formats. However, even if you needed to support this integration pattern through WCF, you would most likely want to create a separate receive location anyway, with its own independently configured XML disassembler pipeline component.

    Instructions
    These steps show how to implement a request-response version of this.

    WCF Receive Locations
    In BizTalk, for the WCF receive location for the ESB on-ramp, set the adapter Message settings\bindings to "UseBodyPath":
    - Inbound BizTalk message body = Body
    - Outbound WCF message body = Body

    Create a WSDL for each supported integration use case
    Save a copy of the WSDL for the WCF generic receive location above that you intend the client system to use. Give it a name that mirrors the interface agreement (e.g. Esb_SuppliersSearchCommand_wsHttpBinding.wsdl). Add any XSD schema files imported below to this same folder.
Edit the WSDL to import schemas
For example, this:

<xsd:schema targetNamespace="http://microsoft.practices.esb/Imports" />

… would become something like:

<xsd:schema targetNamespace="http://microsoft.practices.esb/Imports">
    <xsd:import schemaLocation="SupplierSearchCommand_V1.xsd"
                namespace="http://schemas.acme.co.uk/suppliersearchcommand/1.0"/>
    <xsd:import schemaLocation="SuppliersDocument_V1.xsd"
                namespace="http://schemas.acme.co.uk/suppliersdocument/1.0"/>
    <xsd:import schemaLocation="Types\Supplier_V1.xsd"
                namespace="http://schemas.acme.co.uk/types/supplier/1.0"/>
    <xsd:import schemaLocation="GovTalk\bs7666-v2-0.xsd"
                namespace="http://www.govtalk.gov.uk/people/bs7666"/>
    <xsd:import schemaLocation="GovTalk\CommonSimpleTypes-v1-3.xsd"
                namespace="http://www.govtalk.gov.uk/core"/>
    <xsd:import schemaLocation="GovTalk\AddressTypes-v2-0.xsd"
                namespace="http://www.govtalk.gov.uk/people/AddressAndPersonalDetails"/>
</xsd:schema>

Modify the input and output messages
For example, this:

<wsdl:message name="ProcessRequestResponse_SubmitRequestResponse_InputMessage">
  <wsdl:part name="part" type="xsd:anyType"/>
</wsdl:message>
<wsdl:message name="ProcessRequestResponse_SubmitRequestResponse_OutputMessage">
  <wsdl:part name="part" type="xsd:anyType"/>
</wsdl:message>

… would become something like:

<wsdl:message name="ProcessRequestResponse_SubmitRequestResponse_InputMessage">
  <wsdl:part name="part"
             element="ssc:SupplierSearchEvent"
             xmlns:ssc="http://schemas.acme.co.uk/suppliersearchcommand/1.0" />
</wsdl:message>
<wsdl:message name="ProcessRequestResponse_SubmitRequestResponse_OutputMessage">
  <wsdl:part name="part"
             element="sd:SuppliersDocument"
             xmlns:sd="http://schemas.acme.co.uk/suppliersdocument/1.0"/>
</wsdl:message>

This WSDL can now be added as a service reference in client solutions.

    Read the article

  • Managing Operational Risk of Financial Services Processes – part 2/2

    - by Sanjeev Sharma
    In my earlier blog post, I described the factors that lead to compliance complexity of financial services processes. In this post, I will outline the business implications of the increasing process compliance complexity and the specific role of BPM in addressing the operational risk reduction objectives of regulatory compliance.

    First, let's look at the business implications of increasing complexity of process compliance for financial institutions:
    - Increased time and cost of compliance, due to duplication of effort in conforming to regulatory requirements as processes change in response to evolving regulatory mandates, shifting business priorities, or internal/external audit requirements
    - Delays in audit reporting, due to quality issues in reconciling non-standard process KPIs and integrity concerns arising from the need to rely on multiple data sources for a given process

    Next, let's consider some approaches to managing the operational risk of business processes. Financial institutions looking to reduce the operational risk of their processes, generally speaking, have two choices:
    - Rip and replace existing applications with new off-the-shelf applications.
    - Extend the capabilities of existing applications by modeling their data and process interactions with other applications or user channels, outside of the application boundary, using BPM.

    The benefit of the first approach is that compliance with new regulatory requirements would be embedded within the boundaries of these applications. However, pre-built compliance of any packaged or custom-built application should not be mistaken for a one-shot fix for future compliance needs. The reason is that business needs and regulatory requirements inevitably outgrow the end-to-end capabilities of even the most comprehensive packaged or custom-built business application. Thus, processes that originally resided within the application will eventually spill outside the application boundary. It is precisely at such hand-offs between applications, or between overlaying processes, that vulnerabilities arise to unknown and accidental faults that potentially result in errors and lead to partial or total failure.

    The gist of the above argument is that processes which reside outside application boundaries, in other words span multiple applications, constitute a latent operational risk across the end-to-end value chain. For instance, distortion of data flowing from an account-opening application to a credit-rating system, if left unchecked, renders compliance with "KYC" policies void even when the "KYC" checklist was enforced at the time of data capture by the account-opening application. Oracle Business Process Management is enabling financial institutions to lower the operational risk of such process "gaps" for financial services processes including "Customer On-boarding", "Quote-to-Contract", "Deposit/Loan Origination", "Trade Exceptions", "Interest Claim Tracking" etc.
If you are faced with a similar challenge and need any guidance, feel free to drop me a note.

    Read the article

  • Embedding a CMS in an MVC Web App

    - by Mr Snuffle
    I'm working on a website for searching for businesses and then displaying a listing page. We've been toying with the idea of letting the clients manage their listing page using an external CMS. I'm not sure how often this is done, or if it's even best practice. Ideally, we want to be able to set up a listing on our website, then give the clients access to an external CMS where they can manage their listing page. We then want to embed this custom page within our website, possibly using an iframe (which will come along with its own set of complications). We'd like this integration to be as seamless as possible. I'd personally prefer it if we could directly inject the HTML into our own page and bypass an iframe altogether, but I don't know of any CMS hosting services that provide the interface for such a thing. We've experimented a little with Squarespace, and we can get a fairly clean version of someone's page which would be well suited to an iframe. I'm wondering if anyone else has looked at integrating an externally hosted CMS into a website (in this case, we're using ASP.NET MVC). We'd also want to automate the creation of accounts on this external CMS, so when a user signed up we could just point them to the website with some login details. I have no idea if anyone offers a service like this, but any recommendations would be greatly appreciated. We could host a service ourselves too, but the aim is to have an external system that clients can use to manage their pages. Cheers, James

    Read the article

  • How would a user stay logged in to a REST-based website?

    - by unforgiven3
    A year or so ago I asked this question: Can you help me understand this? "Common REST Mistakes: Sessions are irrelevant". My question was essentially this: Okay, I get that HTTP authentication is done automatically on every message - but how? Is the username/password sent with every request? Doesn't that just increase the attack surface area? I feel like I'm missing part of the puzzle. The answers I received made perfect sense in the context of a mobile (iPhone, Android, WP7) app - when talking to a REST service, the app would just send user credentials along with each request. That worked great for me. But now, I would like to better understand how one would secure a REST-like website, like StackOverflow itself or something like Reddit. How would things work if it were a user logged in via a web browser instead of via an iPhone app? What happens when a user logs in? Are the credentials saved in the browser somehow? How would the browser know what credentials to send with subsequent REST requests? What if it's a JavaScript call to a web service? How would the JavaScript call include user credentials? I'll be quite frank: my understanding of security when it comes to websites is pretty limited. I enjoyed working with REST services from an app perspective, but now I want to try and build a website that is based on REST principles, and I'm finding myself to be pretty lost. If there is anything in the above question that is unclear that you'd like me to clarify, please leave a comment and I'll address it.
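
    One common pattern, sketched here under assumptions rather than prescribed, is that the browser never stores the password itself: the server issues a session cookie or token at login, and the client attaches that to every later request. The endpoints (/api/login, /api/orders) and the "token" field below are invented for illustration.

    // Sketch: a browser client logs in once, keeps the token the server
    // returns, and attaches it to every subsequent request.
    let authToken: string | null = null;

    async function login(username: string, password: string): Promise<void> {
      const response = await fetch("/api/login", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ username, password }),
      });
      if (!response.ok) throw new Error("Login failed");
      const data = await response.json();
      authToken = data.token; // the raw credentials are not stored client-side
    }

    async function getOrders(): Promise<unknown> {
      const response = await fetch("/api/orders", {
        headers: authToken ? { Authorization: `Bearer ${authToken}` } : {},
      });
      return response.json();
    }

    With cookie-based sessions the idea is the same, except the browser attaches the session cookie automatically instead of the script adding an Authorization header.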

    Read the article

  • West Wind WebSurge - an easy way to Load Test Web Applications

    - by Rick Strahl
    A few months ago on a project the subject of load testing came up. We were having some serious issues with a Web application that would start spewing SQL lock errors under somewhat heavy load. These sorts of errors can be tough to catch, precisely because they only occur under load and not during typical development testing. To replicate this error more reliably we needed to put a load on the application and run it for a while before these SQL errors would flare up.

    It's been a while since I'd looked at load testing tools, so I spent a bit of time looking at different tools and frankly didn't really find anything that was a good fit. A lot of tools were either a pain to use, didn't have the basic features I needed, or were extravagantly expensive. In the end I got frustrated enough to build an initially small custom load test solution that then morphed into a more generic library, then gained a console front end and eventually turned into a full-blown Web load testing tool that is now called West Wind WebSurge.

    I got seriously frustrated looking for tools every time I needed some quick and dirty load testing for an application. If my aim is just to put an application under heavy enough load to find a scalability problem in code, or to simply try to push an application to its limits on the hardware it's running on, I shouldn't have to struggle to set up tests. It should be easy enough to get going in a few minutes, so that testing can be set up quickly and done on a regular basis without a lot of hassle. And that was the goal when I started to build out my initial custom load tester into a more widely usable tool.

    If you're in a hurry and you want to check it out, you can find more information and download links here:
    - West Wind WebSurge Product Page
    - Walk-through Video
    - Download link (zip)
    - Install from Chocolatey
    - Source on GitHub

    For a more detailed discussion of the whys and hows and some background, continue reading.

    How did I get here?
    When I started out on this path, I wasn't planning on building a tool like this myself – but I got frustrated enough looking at what's out there to think that I can do better than what's available for the most common simple load testing scenarios. When we ran into the SQL lock problems I mentioned, I started looking around at what's available for Web load testing solutions that would work for our whole team, which consisted of a few developers and a couple of IT guys, both of whom needed to be able to run the tests. It had been a while since I looked at tools and I figured that by now there should be some good solutions out there, but as it turns out I didn't really find anything that fit our relatively simple needs without costing an arm and a leg…

    I spent the better part of a day installing and trying various load testing tools and to be frank most of them were either terrible at what they do, incredibly unfriendly to use, used some terminology I couldn't even parse, or were extremely expensive (and I mean in the 'sell your liver' range of expensive). Pick your poison. There are also a number of online solutions for load testing and they actually looked more promising, but those wouldn't work well for our scenario as the application is running inside of a private VPN with no outside access into the VPN. Most of those online solutions also ended up being very pricey as well – presumably because the bandwidth required to test over the open Web can be enormous.
When I asked around on Twitter what people were using – I got mostly… crickets. Several people mentioned Visual Studio Load Test, and most other suggestions pointed to online solutions. I did get a bunch of responses though with people asking to let them know what I found – apparently I'm not alone when it comes to finding load testing tools that are effective and easy to use. As to Visual Studio, the higher-end SKUs of Visual Studio and the test edition include a Web load testing tool, which is quite powerful, but there are a number of issues with that: First, it's tied to Visual Studio so it's not very portable – you need a VS install. I also find the test setup and terminology used by the VS test runner extremely confusing. Heck, it's complicated enough that there's even a Pluralsight course on using the Visual Studio Web test from Steve Smith. And of course you need to have one of the high-end Visual Studio SKUs, and those are mucho Dinero ($$$) – just for load testing that's rarely an option. Some of the tools are ultra extensive and let you run analysis tools on the target servers, which is useful, but in most cases just plain overkill that only distracts from what I tend to be ultimately interested in: reproducing problems that occur at high load, and finding the upper limits and 'what if' scenarios as load is ramped up increasingly against a site. Yes, it's useful to have Web app instrumentation, but often that's not what you're interested in. I still fondly remember the early days of Web testing when Microsoft had the WAST (Web Application Stress Tool) tool, which was rather simple – and also somewhat limited – but easily allowed you to create stress tests very quickly. It had some serious limitations (mainly that it didn't work with SSL), but the idea behind it was excellent: create tests quickly and easily and provide a decent engine to run them locally with minimal setup. You could get set up and run tests within a few minutes. Unfortunately, that tool died a quiet death, like so many of Microsoft's tools that were probably built by an intern and then abandoned, even though there was a lot of potential and it was actually fairly widely used. Eventually the tool was no longer downloadable, and now it simply doesn't work anymore on higher-end hardware.

West Wind WebSurge – Making Load Testing Quick and Easy
So I ended up creating West Wind WebSurge out of rebellious frustration… The goal of WebSurge is to make it drop-dead simple to create load tests. It's super easy to capture sessions either using the built-in capture tool (big props to Eric Lawrence, Telerik and FiddlerCore, which made that piece a snap), using the full version of Fiddler and exporting sessions, or by manually or programmatically creating text files based on plain HTTP headers to create requests. I've been using this tool for 4 months now on a regular basis on various projects as a reality check for performance and scalability, and it's worked extremely well for finding small performance issues. I also use it regularly as a simple URL tester, as it allows me to quickly enter a URL plus headers and content and test that URL and its results, along with the ability to easily save one or more of those URLs. A few weeks back I made a walk-through video that goes over most of the features of WebSurge in some detail. Note that the UI has slightly changed since then, so there are some UI improvements.
Most notably, the test results screen has recently been updated to a different layout and to provide more information about each URL in a session at a glance. The video and the main WebSurge site have a lot of info on basic operations. For the rest of this post I'll talk about a few deeper aspects that may be of interest, while also giving a glance at how WebSurge works.

Session Capturing
As you would expect, WebSurge works with Sessions of URLs that are played back under load. Here's what the main Session View looks like: You can create session entries manually by individually adding URLs to test (on the Request tab on the right) and saving them, or you can capture output from Web browsers, Windows desktop applications that call services, or your own applications, using the built-in Capture tool. With this tool you can capture anything HTTP: SSL requests and content from Web pages, AJAX calls, SOAP or REST services – again, anything that uses the Windows or .NET HTTP APIs. Behind the scenes the capture tool uses FiddlerCore, so basically anything you can capture with Fiddler you can also capture with the WebSurge Session capture tool. Alternately you can use Fiddler itself and then export the captured Fiddler trace to a file, which can then be imported into WebSurge. This is a nice way to let somebody capture a session without having to install WebSurge, or for your customers to provide an exact playback scenario for a given set of URLs that cause a problem. Note that not all applications work with Fiddler's proxy unless you configure a proxy. For example, .NET Web applications that make HTTP calls usually don't show up in Fiddler by default. For those .NET applications you can explicitly override proxy settings to capture those service call requests. The capture tool also has handy optional filters that allow you to filter by domain, to help block out noise that you typically don't want to include in your requests. For example, if your pages include links to CDNs, Google Analytics or social links, you typically don't want to include those in your load test, so by capturing just from a specific domain you are guaranteed content from only that one domain. Additionally you can provide URL filters in the configuration file – filters let you provide strings that, if contained in a URL, will cause those requests to be ignored. Again this is useful if you don't filter by domain but you want to filter out things like static image, CSS and script files. Often you're not interested in the load characteristics of these static and usually cached resources, as they just add noise to tests and often skew the overall URL performance results. In my testing I tend to care only about my dynamic requests.

SSL captures require Fiddler
Note that in order to capture SSL requests you'll have to install Fiddler's SSL certificate. The easiest way to do this is to install Fiddler and use its SSL configuration options to get the certificate into the local certificate store. There's a document on the Telerik site that provides the exact steps to get SSL captures to work with Fiddler and therefore with WebSurge.

Session Storage
A group of URLs entered or captured makes up a Session. Sessions can be saved and restored easily, as they use a very simple text format that is simply stored on disk. The format is slightly customized HTTP header traces separated by a separator line.
The headers are standard HTTP headers, except that the full URL, instead of just the domain-relative path, is stored as part of the first HTTP header line for easier parsing. Because it's just text and uses the same format that Fiddler uses for exports, it's super easy to create Sessions by hand or under program control by writing out a simple text file. You can see what this format looks like in the Capture window figure above – the raw captured format is also what's stored to disk and what WebSurge parses from. The only 'custom' part of these headers is that the first line contains the full URL instead of the domain-relative path and Host: header. The rest of each entry is just plain standard HTTP headers, with each individual URL isolated by a separator line. The format is also what Fiddler produces for exports, so it's easy to exchange or view data in either Fiddler or WebSurge. URLs can also be edited interactively so you can modify the headers easily as well. Again – it's just plain HTTP headers, so anything you can do with HTTP can be added here.

Use it for single URL testing
Incidentally, I've also found this form to be an excellent way to test and replay individual URLs for simple non-load testing purposes. Because you can capture a single URL or many URLs and store them on disk, this also provides a nice HTTP playground where you can record URLs with their headers, and fire them one at a time or as a session and see results immediately. It's actually a handy way to do REST presentations, and I find the simple UI flow easier than using Fiddler natively. Finally, you can save one or more URLs as a session for later retrieval. I'm using this more and more for simple URL checks.

Overriding Cookies and Domains
Speaking of HTTP headers – you can also override cookies as part of the options. One thing that happens with modern Web applications is that you have session cookies in use for authorization. These cookies tend to expire at some point, which would invalidate a test. Using the Options dialog you can override the cookie, which replaces the cookie for all requests with the value specified there. You can capture a valid cookie from a manual HTTP request in your browser and then paste it into the cookie field, to replace the existing cookie with the new one that is now valid. Likewise you can easily replace the domain, so if you captured URLs on west-wind.com and now you want to test on localhost you can do that easily as well. You could even do something like capture on store.west-wind.com and then test on localhost/store, which would also work.

Running Load Tests
Once you've created a Session you can specify the length of the test in seconds, and specify the number of simultaneous threads to run each session on. Sessions run through each of the URLs in the session sequentially by default. One option in the options list above is that you can also randomize the URLs so each thread runs requests in a different order. This avoids bunching up URLs when tests start, when all threads would otherwise run the same requests simultaneously, which can sometimes skew the results of the first few minutes of a test. While sessions run, some progress information is displayed. By default there's a live view of requests displayed in a console-like window. At the bottom of the window there's a running total summary that displays where you're at in the test, how many requests have been processed, and what the current requests-per-second count is for all requests.
Note that for tests that run over a thousand requests a second it's a good idea to turn off the console display. While the console display is nice to see that something is happening and also gives you a slight idea of what's happening with actual requests, once a lot of requests are processed this UI updating actually adds a lot of CPU overhead to the application, which may cause the actual load generated to be reduced. If you are running 1,000 requests a second there's not much to see anyway, as requests roll by way too fast to make out individual lines. If you look on the options panel, there is a NoProgressEvents option that disables the console display. Note that the summary display is still updated approximately once a second, so you can always tell that the test is still running.

Test Results
When the test is done you get a simple Results display. On the right you get an overall summary as well as a breakdown by each URL in the session. Both successes and failures are highlighted, so it's easy to see what's breaking in your load test. The report can be printed, or you can open the HTML document in your default Web browser for printing to PDF or saving the HTML document to disk. The list on the right shows you a partial list of the URLs that were fired so you can look in detail at the request and response data. The list can be filtered by successful and failed requests. Each list is partial only (at the moment) and limited to a maximum of 1000 items in order to render reasonably quickly. Each item in the list can be clicked to see the full request and response data. This is particularly useful for errors, so you can quickly see and copy what request data was used, and in the case of a GET request you can also just click the link to quickly jump to the page. For non-GET requests you can find the URL in the Session list, and use the context menu to test the URL as configured, including any HTTP content data to send. You get to see the full HTTP request and response as well as a link in the Request header to go visit the actual page. Not so useful for a POST as above, but definitely useful for GET requests. Finally, you can also get a few charts. The most useful one is probably the Requests per Second chart, which can be accessed from the Charts menu or shortcut. Here's what it looks like: Results can also be exported to JSON, XML and HTML. Keep in mind that these files can get very large rather quickly though, so exports can end up taking a while to complete.

Command Line Interface
WebSurge runs with a small core load engine, and this engine is plugged into the front-end application I've shown so far. There's also a command line interface available to run WebSurge from the Windows command prompt. Using the command line you can run tests for either an individual URL (similar to AB.exe, for example) or a full Session file. By default WebSurgeCli shows progress every second: total request count, failures, and the requests per second for the entire test. A silent option can turn off this progress display and show only the results. The command line interface can be useful for build integration, which allows checking for failures, or for hitting a specific requests-per-second count, etc. It's also nice to use this as a quick and dirty URL test facility, similar to the way you'd use Apache Bench (ab.exe). Unlike ab.exe though, WebSurgeCli supports SSL and makes it much easier to create multi-URL tests using either manual editing or the WebSurge UI.
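
As an aside, the core idea the post describes is easy to picture in code. The sketch below is NOT WebSurge code, just a minimal illustration of the concept: a number of concurrent workers replay a list of URLs for a fixed duration and the totals yield a requests-per-second figure. The URLs are placeholders, and a fetch-capable runtime (a browser or Node 18+) is assumed.

    // Sketch of the concept only -- not WebSurge. N workers replay a list of
    // URLs until the time is up, then requests per second is computed.
    async function runLoadTest(urls: string[], threads: number, seconds: number): Promise<void> {
      const endTime = Date.now() + seconds * 1000;
      let completed = 0;
      let failed = 0;

      async function worker(): Promise<void> {
        while (Date.now() < endTime) {
          for (const url of urls) {
            try {
              const res = await fetch(url);
              if (res.ok) completed++; else failed++;
            } catch {
              failed++;
            }
          }
        }
      }

      await Promise.all(Array.from({ length: threads }, () => worker()));
      console.log(`Requests: ${completed}, failed: ${failed}, req/sec: ${(completed / seconds).toFixed(1)}`);
    }

    // Example: 8 concurrent workers hammering two placeholder URLs for 30 seconds.
    void runLoadTest(["https://example.com/", "https://example.com/store"], 8, 30);
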
Current Status
West Wind WebSurge is currently still in beta. I'm still adding small new features and tweaking the UI in an attempt to make it as easy and self-explanatory as possible to run. Documentation for the UI and specialty features is also still a work in progress. I plan on open-sourcing this product, but it won't be free. There's a free version available that provides a limited number of threads and request URLs to run. A relatively low-cost license removes the thread and request limitations. Pricing info can be found on the Web site – there's an introductory price which is $99 at the moment, which I think is reasonable compared to most other for-pay solutions out there that are exorbitant by comparison… The reason the code is not available yet is – well, the UI portion of the app is a bit embarrassing in its current monolithic state. The UI started as a very simple interface originally that later got a lot more complex – yeah, that never happens, right? Unless there's a lot of interest I don't foresee rewriting the UI entirely (which would be ideal), but in the meantime at least some cleanup is required before I dare to publish it :-). The code will likely be released with version 1.0. I'm very interested in feedback. Do you think this could be useful to you and provide value over other tools you may or may not have used before? I hope so – it has already provided a ton of value for me and the work I do, which made the development worthwhile at this point. You can leave a comment below, or for more extensive discussions you can post a message on the West Wind Message Board in the WebSurge section.

Microsoft MVPs and Insiders get a free license
If you're a Microsoft MVP or a Microsoft Insider you can get a full license for free. Send me a link to your current, official Microsoft profile and I'll send you a not-for-resale license. Send any messages to [email protected].

Resources
For more info on WebSurge and to download it to try it out, use the following links:
- West Wind WebSurge Home
- Download West Wind WebSurge
- Getting Started with West Wind WebSurge Video

© Rick Strahl, West Wind Technologies, 2005-2014. Posted in ASP.NET

    Read the article

  • Daily tech links for .net and related technologies - June 1-3, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - June 1-3, 2010
    Web Development
    - Anti-Forgery Request Recipes For ASP.NET MVC And AJAX - Dixin
    - ASP.NET MVC 2 Localization Complete Guide - Alex Adamyan
    - Dynamically Structured ViewModels in ASP.NET MVC - Keith Brown
    - ASP.NET MVC Time Planner is available at CodePlex - Gunnar Peipman
    - Part 2 – A Cascading Hierarchical Field Template & Filter for Dynamic Data - Steve
    - SharePoint Server 2010 Enterprise Content Management Resources - Andrew Connell
    Web...(read more)

    Read the article

  • Daily tech links for .net and related technologies - May 10-12, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - May 10-12, 2010
    Web Development
    - jQuery Templates and Data Linking (and Microsoft contributing to jQuery) - ScottGu
    - ASP.NET MVC and jQuery Part 4 – Advanced Model Binding - Mister James
    - Creating an ASP.NET report using Visual Studio 2010 - Part 1 & Part 2 & Part 3 - rajbk
    - Caching Images in ASP.NET MVC - Evan
    - How to Localize an ASP.NET MVC Application - mikeceranski
    - Localization in ASP.NET MVC 2 using ModelMetadata - Raj Kiamal
    Web Design...(read more)

    Read the article

  • Daily tech links for .net and related technologies - May 13-16, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - May 13-16, 2010
    Web Development
    - Integrating Twitter Into An ASP.NET Website Using OAuth - Scott Mitchell
    - T4MVC Extensions for MVC Partials - Evan
    - Building a Data Grid in ASP.NET MVC - Ali Bastani
    - Introducing the MVC Music Store - MVC 2 Sample Application and Tutorial - Jon Galloway
    - Announcing the RTM of MvcExtensions - kazimanzurrashid
    - Optimizing Your Website For Speed
    Web Design
    - Validation with the jQuery UI Tabs Widget - Chris Love
    - A Brief History...(read more)

    Read the article

  • BackupExec errors when starting services

    - by blade
    Hi, I've installed the Backup Exec 2010 trial on my server with an appropriately privileged AD account, but I get errors when starting the required services from the login page:

    Processing services
    Start services on server: WIN-HQ7JSCRTTSQ
    Starting Enterprise Vault Admin Service on WIN-HQ7JSCRTTSQ.
    The service Enterprise Vault Admin Service is already running on WIN-HQ7JSCRTTSQ.
    Starting Backup Exec Remote Agent for Windows Systems on WIN-HQ7JSCRTTSQ.
    The service Backup Exec Remote Agent for Windows Systems is already running on WIN-HQ7JSCRTTSQ.
    Starting Backup Exec Device & Media Service on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Device & Media Service on WIN-HQ7JSCRTTSQ.
    Service-specific error code returned: 0x2000e2d3 (536928979)
    Starting Backup Exec Server on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Server on WIN-HQ7JSCRTTSQ.
    The dependency service or group failed to start.
    Starting Backup Exec Job Engine on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Job Engine on WIN-HQ7JSCRTTSQ.
    The dependency service or group failed to start.
    Starting Backup Exec Agent Browser on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Agent Browser on WIN-HQ7JSCRTTSQ.
    The dependency service or group failed to start.
    Starting Backup Exec DLO Administration Service on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec DLO Administration Service on WIN-HQ7JSCRTTSQ.
    Error code returned:
    Starting Backup Exec DLO Maintenance Service on WIN-HQ7JSCRTTSQ.
    The service Backup Exec DLO Maintenance Service is already running on WIN-HQ7JSCRTTSQ.
    Starting Backup Exec Web Service on WIN-HQ7JSCRTTSQ.
    The service Backup Exec Web Service is already running on WIN-HQ7JSCRTTSQ.
    Start services on server WIN-HQ7JSCRTTSQ completed.
    Processing services completed!

    How can I resolve this?

    Read the article

  • Windows Network Load Balancing on ESX Cluster with Dell PowerConnect stacks

    - by dunxd
    We recently switched out our Cisco 6500 core switch for a pair of Dell PowerConnect 6248 stacks. Since then, our network load balanced SharePoint, which runs on two virtual machines on an ESX cluster, has been behaving very poorly. The symptoms are that opening and saving documents stored in SharePoint takes a very, very long time. There are no errors showing up on the SharePoint servers or the SQL server, just a lot of annoyed users. Initially I thought there was no way NLB could cause this, but as soon as we repointed the DNS records for our intranet to the IP address of one of the web front ends, the problems disappeared. We suspect there is an issue related to multicast in the Dell configs - NLB is configured for multicast, but not IGMP. Has anyone got a similar setup to us and fixed this sort of issue? SharePoint on VMware ESX, with Dell PowerConnect switches.

    Read the article

  • EC2 EBS AMI Instance stopping/restarting doesn't start services

    - by tgm
    I've recently been moving our instances to EBS instances (CentOS) and still have a bit of confusion about what's happening when I "stop" an instance. I have some of my services enabled for runlevels 3, 4 and 5, but when I start a stopped instance the services don't start. What's actually happening when I issue a stop command to the instance, and how do I get my services to start automatically when I start the instance up again?

    Read the article

  • What would be the total expenses of running an in-home web server?

    - by techaddict
    I have several hosting accounts with companies, and plan to keep them. However, I would like to try my hand at creating my own home web server, both for fun and for the learning experience. I would like to know the expenses involved, including: electricity costs (greater than a light bulb?); internet costs (will I have to upgrade my internet, or is a 3-5 Mbps upload speed fine for a web server with a medium amount of traffic? Would I have to get a separate internet connection?); and other unknown expenses. Consider that I will configure the web server myself, so that is not an expense. Also consider that I already have a computer (a year-old Dell laptop, 15R) to dedicate as the web server.
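
    A rough back-of-envelope for the electricity part: the figures below are assumptions, not measurements. A lightly loaded laptop might average around 30 W (roughly light-bulb territory), and an illustrative electricity rate of $0.12/kWh is used; plug in your own numbers.

    // Back-of-envelope electricity cost for running a laptop as a 24/7 server.
    // The wattage and price per kWh are assumptions -- substitute real values.
    const watts = 30;            // assumed average draw of a lightly loaded laptop
    const pricePerKwh = 0.12;    // assumed electricity price in $/kWh
    const hoursPerMonth = 24 * 30;

    const kwhPerMonth = (watts / 1000) * hoursPerMonth; // ~21.6 kWh per month
    const costPerMonth = kwhPerMonth * pricePerKwh;     // ~$2.60 per month

    console.log(`~${kwhPerMonth.toFixed(1)} kWh/month, ~$${costPerMonth.toFixed(2)}/month`);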

    Read the article

  • Windows Server 2003 R2 SP2 throws event ID 2269 after installing Excel WebPart in MOSS 2007

    - by Phil
    We recently added an Excel workbook web part (a read-only Excel file, no editing) on our SharePoint 2007 server. Once we did that, approximately 3-4 times an hour event ID 2269 is shown in the Application log, and a few minutes after that an event ID 1002 is displayed in the System log and the SharePoint Office Servers application pool shuts down. We've already checked the "Bypass traverse checking" and DCOM settings per the MS KB, and I have opened a ticket with MS Support. The problem is that MS Support (SharePoint) thinks it is an IIS problem and the IIS people think it is a SharePoint issue. Has anyone else seen this problem? If we remove the Excel web part, everything goes back to normal. The app pools and the SP and SP Central Admin sites are all using the same domain service account. Thanks in advance, Phil

    Read the article
