Search Results

Search found 986 results on 40 pages for 'rick moss'.


  • WCF service with Factory attribute on .svc is not working on web server (IIS6), but works locally (IIS7)

    - by Jessica
    I am working on implementing a non-web.config approach to WCF services using the factory attribute on the .svc file per Rick Strahl's blog post: Factory="System.ServiceModel.Activation.WebScriptServiceHostFactory" Locally, I am running IIS7 in Visual Studio 2008 and have no problem, but when I deploy to my web server (currently running IIS6), I am getting an authentication error in the event log: Exception: System.ServiceModel.ServiceActivationException: The service '/Services/ResourcesService.svc' cannot be activated due to an exception during compilation. The exception message is: IIS specified authentication schemes 'IntegratedWindowsAuthentication, Anonymous', but the binding only supports specification of exactly one authentication scheme. Valid authentication schemes are Digest, Negotiate, NTLM, Basic, or Anonymous. Change the IIS settings so that only a single authentication scheme is used.. --- System.InvalidOperationException: IIS specified authentication schemes 'IntegratedWindowsAuthentication, Anonymous', but the binding only supports specification of exactly one authentication scheme. Valid authentication schemes are Digest, Negotiate, NTLM, Basic, or Anonymous. Change the IIS settings so that only a single authentication scheme is used. at System.ServiceModel.Web.WebServiceHost.SetBindingCredentialBasedOnHostedEnvironment(ServiceEndpoint serviceEndpoint, AuthenticationSchemes supportedSchemes) at System.ServiceModel.Web.WebServiceHost.AddAutomaticWebHttpBindingEndpoints(ServiceHost host, IDictionary`2 implementedContracts, String multipleContractsErrorMessage) at System.ServiceModel.WebScriptServiceHost.OnOpening() at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout) at System.ServiceModel.Channels.CommunicationObject.Open() at System.ServiceModel.ServiceHostingEnvironment.HostingManager.ActivateService(String normalizedVirtualPath) at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath) After doing some Googling, I changed my authentication settings on the .svc folder within my project (on the server) to anonymous authentication only, but it did not work. I still get web service failures on the calls. IIS7 by default only had anonymous. I do not have any entries in my web.config for the services (I stripped them out per this pattern). I am using a NAnt script to deploy the website to the server, and I used it locally as well to verify the script was not causing the issue. Is there any known issue with this? Is IIS6 not able to handle it?
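    For reference, the config-less hookup described above comes down to a one-line directive in the .svc file itself. A minimal sketch follows; the Service value below is a placeholder, and only the Factory value comes from the post:

        <%@ ServiceHost Language="C#" Debug="false"
            Service="YourNamespace.Services.ResourcesService"
            Factory="System.ServiceModel.Activation.WebScriptServiceHostFactory" %>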


  • MVC controller is being called twice

    - by rboarman
    Hello, I have a controller that is being called twice from an ActionLink call. My home page has a link, that when clicked calls the Index method on the Play controller. An id of 100 is passed into the method. I think this is what is causing the issue. More on this below. Here are some code snippets: Home page: <%= Html.ActionLink(“Click Me”, "Index", "Play", new { id = 100 }, null) %> Play Controller: public ActionResult Index(int? id) { var settings = new Dictionary<string, string>(); settings.Add("Id", id.ToString()); ViewData["InitParams"] = settings.ToInitParams(); return View(); } Play view: <%@ Page Title="" Language="C#" Inherits="System.Web.Mvc.ViewPage" %> (html <head> omitted for brevity) <body> <form id="form1" runat="server" style="height:100%"> Hello </form> </body> If I get rid of the parameter to the Index method, everything is fine. If I leave the parameter in place, then the Index method is called with 100 as the id. After returning the View, the method is called a second time with a parameter of null. I can’t seem to figure out what is triggering the second call. My first thought was to add a specific route like this: routes.MapRoute( "Play", // Route name "Play/{id}", // URL with parameters new {controller = "Play", action = "Index"} // Parameter defaults ); This had no effect other than making a prettier looking link. I am not sure where to go from here. Thank you in advance. Rick
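    A small note on the route above (a sketch, not a confirmed fix for the double call): as written, "Play/{id}" has no default for id, so a request to /Play without an id will not match it and falls through to the default "{controller}/{action}/{id}" route, reaching Index with a null id. On MVC 2 or later the parameter can be declared optional explicitly:

        routes.MapRoute(
            "Play",                       // Route name
            "Play/{id}",                  // URL with parameters
            new { controller = "Play", action = "Index", id = UrlParameter.Optional }
        );

    That still leaves the question of what is issuing the second, parameterless request; an empty src or href attribute in the rendered page, or a missing favicon.ico, is a common culprit worth ruling out.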


  • Fluent config not generating mapping files

    - by rboarman
    Hello, I am trying to get Fluent nHibernate to generate mappings so I can take a look at the files and the sql. My code is based on this post and on what I can glean from the documentation. http://stackoverflow.com/questions/1375146/fluent-mapping-entities-and-classmaps-in-different-assemblies I am using the latest code from git. Here’s my config code: Configuration cfg = new Configuration(); var ft = Fluently.Configure(cfg); //DbConnection by fluent ft.Database ( MsSqlConfiguration .MsSql2008 .ConnectionString("……") .ShowSql() .UseReflectionOptimizer() ); //get mapping files. ft.Mappings(m => { //set up the mapping locations m.FluentMappings.AddFromAssemblyOf<Entity>() .ExportTo(@"C:\temp"); m.Apply(cfg); }); I also tried: var sessionFactory = Fluently.Configure() .Database(MsSqlConfiguration .MsSql2008 .ShowSql() .ConnectionString(“……")) .Mappings(p => p.FluentMappings .AddFromAssemblyOf<Entity>() .ExportTo(@"c:\temp\")) .BuildSessionFactory(); I have verified that the connection string is correct. The issue is that no mapping files show up in the ExportTo folder and no sql code shows up in the output window or in the log file. No errors or exceptions are generated either. I have no idea where to go from here. Thank you in advance. Rick
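    One thing worth checking, given the linked question (a sketch; EntityMap and connectionString are assumed names): AddFromAssemblyOf<T> scans the assembly that contains T for mappings, so if the ClassMap types live in a different assembly than the entities, it must name a type from the mappings assembly, and ExportTo only writes the .hbm.xml files once the configuration is actually built:

        var sessionFactory = Fluently.Configure()
            .Database(MsSqlConfiguration.MsSql2008
                .ConnectionString(connectionString)   // assumed variable
                .ShowSql())
            .Mappings(m => m.FluentMappings
                .AddFromAssemblyOf<EntityMap>()       // a ClassMap from the mappings assembly (assumed name)
                .ExportTo(@"C:\temp"))
            .BuildSessionFactory();                   // mappings are compiled (and exported) here

    Note also that ShowSql only produces output when a session actually executes SQL; building the factory alone prints nothing.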


  • When *not* to use prepared statements?

    - by Ben Blank
    I'm re-engineering a PHP-driven web site which uses a minimal database. The original version used "pseudo-prepared-statements" (PHP functions which did quoting and parameter replacement) to prevent injection attacks and to separate database logic from page logic. It seemed natural to replace these ad-hoc functions with an object which uses PDO and real prepared statements, but after doing my reading on them, I'm not so sure. PDO still seems like a great idea, but one of the primary selling points of prepared statements is being able to reuse them… which I never will. Here's my setup: The statements are all trivially simple. Most are in the form SELECT foo,bar FROM baz WHERE quux = ? ORDER BY bar LIMIT 1. The most complex statement in the lot is simply three such selects joined together with UNION ALLs. Each page hit executes at most one statement and executes it only once. I'm in a hosted environment and therefore leery of slamming their servers by doing any "stress tests" personally. Given that using prepared statements will, at minimum, double the number of database round-trips I'm making, am I better off avoiding them? Can I use PDO::MYSQL_ATTR_DIRECT_QUERY to avoid the overhead of multiple database trips while retaining the benefit of parametrization and injection defense? Or do the binary calls used by the prepared statement API perform well enough compared to executing non-prepared queries that I shouldn't worry about it? EDIT: Thanks for all the good advice, folks. This is one where I wish I could mark more than one answer as "accepted" — lots of different perspectives. Ultimately, though, I have to give rick his due… without his answer I would have blissfully gone off and done the completely Wrong Thing even after following everyone's advice. :-) Emulated prepared statements it is!


  • Linq join two dictionaries using a common key

    - by rboarman
    Hello, I am trying to join two Dictionary collections together based on a common lookup value. var idList = new Dictionary<int, int>(); idList.Add(1, 1); idList.Add(3, 3); idList.Add(5, 5); var lookupList = new Dictionary<int, int>(); lookupList.Add(1, 1000); lookupList.Add(2, 1001); lookupList.Add(3, 1002); lookupList.Add(4, 1003); lookupList.Add(5, 1004); lookupList.Add(6, 1005); lookupList.Add(7, 1006); // Something like this: var q = from id in idList.Keys join entry in lookupList on entry.Key equals id select entry.Value; The Linq statement above is only an example and does not compile. For each entry in the idList, pull the value from the lookupList based on matching Keys. The result should be a list of Values from lookupList (1000, 1002, 1004). What’s the easiest way to do this using Linq? Thank you, Rick
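    The query above is close; the operands of equals are just reversed (the outer range variable must appear on the left, the inner on the right). A version that compiles and yields 1000, 1002, 1004 for the sample data:

        var q = from id in idList.Keys
                join entry in lookupList on id equals entry.Key
                select entry.Value;

        // or the same thing in method syntax:
        var q2 = idList.Keys.Join(lookupList, id => id, kv => kv.Key, (id, kv) => kv.Value);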


  • Getting identity from Ado.Net Update command

    - by rboarman
    My scenario is simple. I am trying to persist a DataSet and have the identity column filled in so I can add child records. Here's what I've got so far: using (SqlConnection connection = new SqlConnection(connStr)) { SqlDataAdapter adapter = new SqlDataAdapter("select * from assets where 0 = 1", connection); adapter.MissingMappingAction = MissingMappingAction.Passthrough; adapter.MissingSchemaAction = MissingSchemaAction.AddWithKey; SqlCommandBuilder cb = new SqlCommandBuilder(adapter); var insertCmd = cb.GetInsertCommand(true); insertCmd.Connection = connection; connection.Open(); adapter.InsertCommand = insertCmd; adapter.InsertCommand.CommandText += "; set ? = SCOPE_IDENTITY()"; adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.OutputParameters; var param = new SqlParameter("RowId", SqlDbType.Int); param.SourceColumn = "RowId"; param.Direction = ParameterDirection.Output; adapter.InsertCommand.Parameters.Add(param); SqlTransaction transaction = connection.BeginTransaction(); insertCmd.Transaction = transaction; try { assetsImported = adapter.Update(dataSet.Tables["Assets"]); transaction.Commit(); } catch (Exception ex) { transaction.Rollback(); // Log an error } connection.Close(); } The first thing that I noticed, besides the fact that the identity value is not making its way back into the DataSet, is that my change to add the scope_identity select statement to the insert command is not being executed. Looking at the query using Profiler, I do not see my addition to the insert command. Questions: 1) Why is my addition to the insert command not making its way to the sql being executed on the database? 2) Is there a simpler way to have my DataSet refreshed with the identity values of the inserted rows? 3) Should I use the OnRowUpdated callback to add my child records? My plan was to loop through the rows after the Update() call and add children as needed. Thank you in advance. Rick
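    One common workaround (a sketch; the column list is assumed) is to skip the command builder for the insert and hand-write the InsertCommand. A likely culprit for the missing SCOPE_IDENTITY() is that the attached SqlCommandBuilder supplies its own generated command during Update, so edits to the builder's command text get lost; with a hand-built command the output parameter survives and flows back into the DataRow:

        var insert = new SqlCommand(
            "INSERT INTO assets (Name) VALUES (@Name); SET @RowId = SCOPE_IDENTITY();",
            connection);
        insert.Parameters.Add("@Name", SqlDbType.NVarChar, 100, "Name");   // column name assumed
        var rowId = insert.Parameters.Add("@RowId", SqlDbType.Int);
        rowId.Direction = ParameterDirection.Output;
        rowId.SourceColumn = "RowId";
        insert.UpdatedRowSource = UpdateRowSource.OutputParameters;
        adapter.InsertCommand = insert;   // remember to set insert.Transaction as before

    An equivalent variant is to append "SELECT SCOPE_IDENTITY() AS RowId" instead and set UpdatedRowSource = UpdateRowSource.FirstReturnedRecord.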


  • IIS6 compressing static files, but not JSON requests

    - by user500038
    I have IIS6 compression set up, and static content is being compressed correctly as per Coding Horror. Example of a static file: Response Headers Content-Length 55513 Content-Type application/x-javascript Content-Encoding gzip Last-Modified Mon, 20 Dec 2010 15:31:58 GMT Accept-Ranges bytes Vary Accept-Encoding Server Microsoft-IIS/6.0 X-Powered-By ASP.NET Date Wed, 29 Dec 2010 16:37:23 GMT Request Headers Accept */* Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 115 Connection keep-alive As you can see, the response is correctly compressed. However, with a JSON call you'll see the request with the gzip parameter correctly set: Accept application/json, text/javascript, */*; q=0.01 Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 115 Connection keep-alive X-Requested-With XMLHttpRequest Then for the response: Cache-Control private Content-Length 6811 Content-Type application/json; charset=utf-8 Server Microsoft-IIS/6.0 X-Powered-By ASP.NET X-AspNet-Version 2.0.50727 X-AspNetMvc-Version 2.0 No gzip! I found this article by Rick Strahl, which outlines how to compress in your code, but I'd like IIS to handle this. Is this possible with IIS6?


  • Parsing the Youtube API with DOM

    - by Kirk
    I'm using the Youtube API and I can retrieve the date information without a problem, but don't know how to retrieve the description information. My Code: <?php $v = "dQw4w9WgXcQ"; $url = "http://gdata.youtube.com/feeds/api/videos/". $v; $doc = new DOMDocument; $doc->load($url); $pub = $doc->getElementsByTagName("published")->item(0)->nodeValue; $desc = $doc->getElementsByTagName("media:description")->item(0)->nodeValue; echo "<b>Video Uploaded:</b> "; echo date( "F jS, Y", strtotime( $pub ) ); echo '<br>'; if (isset ($desc)) { echo "<b>Description:</b> "; echo $desc; echo '<br>'; } ?> Here's a link to the feed: http://gdata.youtube.com/feeds/api/videos/dQw4w9WgXcQ?prettyprint=true And the excerpt of code I don't know how to retrieve data from: <media:group> <media:description type='plain'>Music video by Rick Astley performing Never Gonna Give You Up. (C) 1987 PWL</media:description> </media:group> Thanks in advance.


  • "The specified view is invalid" in call to LimitedWebPartManager.AddWebPart in SharePoint 2010

    - by Lee Richardson
    This code used to work in WSS 3.0 / MOSS 2007 in FeatureReceiver.FeatureActivated: using (SPLimitedWebPartManager limitedWebPartManager = Site.GetLimitedWebPartManager("default.aspx", PersonalizationScope.Shared)) { ListViewWebPart listViewWebPart = new ListViewWebPart { Title = title, ListName = list.ID.ToString("B").ToUpper(), ViewGuid = view.ID.ToString("B").ToUpper() }; limitedWebPartManager.AddWebPart(listViewWebPart, zone, position); } I'm trying to convert to SharePoint 2010 and it now fails with: System.ArgumentException: The specified view is invalid. at Microsoft.SharePoint.SPViewCollection.get_Item(Guid guid) at Microsoft.SharePoint.WebPartPages.ListViewWebPart.EnsureListAndView(Boolean requireFullBlownViewSchema) at Microsoft.SharePoint.WebPartPages.ListViewWebPart.get_AppropriateBaseViewId() at Microsoft.SharePoint.WebPartPages.SPWebPartManager.AddWebPartInternal(SPSupersetWebPart superset, Boolean throwIfLocked) at Microsoft.SharePoint.WebPartPages.SPLimitedWebPartManager.AddWebPartInternal(WebPart webPart, String zoneId, Int32 zoneIndex, Boolean throwIfLocked) at Microsoft.SharePoint.WebPartPages.SPLimitedWebPartManager.AddWebPart(WebPart webPart, String zoneId, Int32 zoneIndex) Interestingly enough when I run it from a unit test it works, it only fails in FeatureActivated. When I debug with Reflector it is failing on this line: this.view = this.list.LightweightViews[new Guid(this.ViewGuid)]; list.LightweightViews only returns one view, the default view, even though list.Views returns all of them. I have no idea what LightweightViews is supposed to mean and I'm running out of ideas. Anyone else got any?


  • SSRS for Sharepoint, Images in a report from a Sharepoint List URL?

    - by James Polhemus
    Greetings Sabios, I have several reports I run successfully where the data comes from a SharePoint list in the form of an XML dataset. I am, however, having trouble with one. I have a report that pulls an image file onto the main body of the report. This data too comes from a SharePoint list in the form of an XML dataset which sends me the URL to the jpeg or bmp or gif... whatever the case may be. I can successfully pull this off in my own Visual Studio IDE. My local Report Server will render it as well. It won't run on my SharePoint Report Server (my MOSS runs through https while my SharePoint Report Server is http; might this matter?). When I upload it to SharePoint and run it through the SharePoint Report Server, I get back EVERYTHING in the report header and footer (dataset text and embedded images) but just a big RED X where the main image should be. I have done everything the boards say: A. I made sure the Unattended Execution Account is running on the Reports Server B. I have ensured the URL comes back in clean format (else the images wouldn't render locally either, and they do) The report logs throw this exception: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.ContainerTypeNotSupportedException: The target location you specified is not supported by the report server. A report definition (.rdl), report model (.smdl), resource, or shared data source (.rsds) file must be located within a library or a folder within it., ; Info: Microsoft.ReportingServices.Diagnostics.Utilities.ContainerTypeNotSupportedException: The target location you specified is not supported by the report server. A report definition (.rdl), report model (.smdl), resource, or shared data source (.rsds) file must be located within a library or a folder within it. Any takers? Even my SharePoint Administrator can't help me :) James


  • Which parts of Sharepoint do I need to understand to build a publicly facing website?

    - by Petras
    I am building a publicly facing website that does the following. Users log in and then view a list of their customers. They click on a customer to view their past purchases, order them, change them, etc. This is not a shopping site, by the way. It is a simple look-up tool. Note that none of the data accessed by the website is in anything other than a SQL database - no Office documents. Also, the login does not use the users' Windows credentials on a VPN or anything like that. Typically I would build this using a standard ASP.NET MVC website. However, the client says they want to use SharePoint. As I understand it, SharePoint is used for workflow and websites that are collaboration tools, such as the components you can see here: http://www.sharepointhosting.com/sharepoint-features.html Here are my questions: Would I be right in saying that WSS is completely inappropriate for this task as it comes with an overhead that provides no benefits? If I had to use it, would I need WSS or MOSS? If I had to use it, would I be right in saying the site would consist of a) web parts and b) a custom site layout? How do I create one of these?


  • Investigating the root cause behind SharePoint's "request not found in the TrackedRequests"

    - by Muhimbi
    We have a long standing issue in our bug tracking system about the dreaded "ERROR: request not found in the TrackedRequests. We might be creating and closing webs on different threads." message in SharePoint's trace log. As we develop Workflow software for the SharePoint market, we look into this issue from time to time to make sure it is not caused by our products. I have personally come to the conclusion that this is a problem in SharePoint, but perhaps someone else can prove me wrong. Here is what I know: According to the hundreds of search results returned by Google on this topic, this issue appears to be mainly related to SharePoint Workflows, both SharePoint Designer and Visual Studio based workflows. Assuming ULS logging is set to Monitorable, the easiest way to reproduce this problem is to create a new SharePoint Designer Workflow, attach it to a document library, set it to auto start on add/update, don't add any actions, save the workflow and upload a file to the document library. The error is only visible in the SharePoint trace log, it does not appear to impact the execution of the workflow at hand. I have verified that the problem occurs on 32 bit as well as 64 bit systems, Win2K3 and 2K8, WSS and MOSS with SharePoint versions up to the December 2009 Cumulative Update (6524). The problem does not occur when a workflow is started manually. There are dozens of related posts on MSDN Forums, hundreds on Google, one on StackOverflow and none on SharePoint Overflow. There appears to be no answer. Does anyone have any idea about what is going on, what is causing this and if we should worry or file this under 'Red Herrings'.


  • Should we have a database independent SQL like query language in Django? [closed]

    - by Yugal Jindle
    Note: I know we already have the Django ORM, which keeps things database independent and converts to database-specific SQL queries. Once things start getting complicated, it is preferred to write raw SQL queries for better efficiency. When you write raw SQL queries your code gets trapped with the database you are using. I also understand it's important to use the full power of your database, which cannot be achieved with the Django ORM alone. My question: until I use any database-specific feature, why should one be trapped with the database? For instance: we have a query with multiple joins and we decided to write a raw SQL query. Now that makes my website Postgres specific, even though I have not used any Postgres-specific feature. I feel there should be some fake SQL language which can translate to any database's SQL dialect, and even Django's ORM could be built over it. That way, if you go outside the ORM but stay away from database-specific features, you can still remain database independent. I asked the same question to Jacob Kaplan-Moss (in person): he advised me to stay with the database that I like and use its whole power, to which I agree. But my point was not that we should always be database independent. My point is we should be database independent until we use a database-specific feature. Please explain: why should there be a fake SQL layer over the actual SQL?


  • Authenticated user cannot log in, "The user does not exist or is not unique."

    - by Aquinas
    This is a weird one. I have a WSS3 site, no MOSS, with a custom membership and role provider that authenticates against CRM. All the users have also been added to the site user list so once logged in they have correct display names. On dev and stage everything works fine, but on UAT the users can't get past the login screen. The login screen is working, in that if you type an incorrect password for a user it comes back with the right message, meaning the custom provider is working fine. If you fill the login form in correctly you are immediately redirected straight back to the login screen, with the IIS logs showing that the login screen sent the authenticated user to the site and then was sent back. Setting the site to allow anonymous access shows that the user is not logged in on the site side after authenticating correctly. The ULS logs show: The user does not exist or is not unique. Found 1 trusted forests nzct.local. Found 0 trusted domains Adding logging code to the site I have verified that the membership provider is correctly set, and can find the user when asked. Also, when accessing the site user list, I can find the SP user with the right name. It just refuses to set the current user to be the authenticated user. Weird.


  • How can I tell whether a webpart that has been deployed to a site is a native webpart that ships with SharePoint?

    - by program247365
    I have a SharePoint 2007 MOSS instance, and I'm on a fact-finding mission. There have been multiple developers, developing multiple webparts and deploying them (using VS2005/2008 SharePoint Extensions). I thought maybe I could look at the fields in the "Web Part Gallery" list in my site, and look by "Modified by", but it looks like a developer's name is on some of the out-of-the-box webparts somehow, and on ones I know are custom developed, they say "System Account" - so looking at that field in this list is a no go. I thought then maybe I could look at the "Group" to which each webpart was assigned but it looks like they were arbitrarily assigned to many different groups inconsistently - so using that piece of information is a no go. Here is my code I have for just looping through and getting the names of all the webparts. Is there any property I can access on the list items of webparts that would tell me whether it's a custom developed webpart? Any way to distinguish the custom webparts from the out-of-the-box ones? Is there another way to do this? #region Misc Site Collection Methods public static List<string> GetAllWebParts(string connectedSPInstanceUrl) { List<string> lstWebParts = new List<string>(); try { using (SPSite site = new SPSite(connectedSPInstanceUrl)) { using (SPWeb web = site.OpenWeb()) { SPList list = web.Lists["Web Part Gallery"]; foreach (SPListItem item in list.Items) { lstWebParts.Add(item.Name); } } } } catch (Exception ex) { lstWebParts.Add("Error"); lstWebParts.Add("Message: " + ex.Message); lstWebParts.Add("Inner Exception: " + ex.InnerException.ToString()); lstWebParts.Add("Stack Trace: " + ex.StackTrace); } return lstWebParts; } #endregion
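    Short of a definitive flag on the gallery item, one workable heuristic (a sketch, and only a heuristic) is to open each .webpart/.dwp file in the gallery and look at which assembly the registered type points to; out-of-the-box parts resolve to Microsoft.* assemblies while custom parts resolve to your own:

        foreach (SPListItem item in list.Items)
        {
            byte[] bytes = item.File.OpenBinary();
            string xml = System.Text.Encoding.UTF8.GetString(bytes);
            bool looksBuiltIn = xml.Contains("Microsoft.SharePoint") || xml.Contains("Microsoft.Office");
            lstWebParts.Add(item.Name + (looksBuiltIn ? " [out-of-the-box?]" : " [custom?]"));
        }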


  • How can I share an entity framework model across website users

    - by richardmoss
    Hello, Currently my website is based around MVC and the Entity Framework running against a SQL Server 2005 database. So far, it has all been running very smoothly, and I really enjoy MVC and its slimmer, more concise code (and no huge viewstates or soul-destroying postbacks ;)) Recently I was working on upgrading the site to use a simple forum system, and this is where I started running into problems. When I was testing the site using two different browsers, if I created or replied to a post in one browser, the other browser couldn't see the post. At the moment, each visitor to the site gets their own copy of the entity model, which I store in their session data. Obviously this is the problem, as updates to one model aren't getting carried to the other. As a test, I tried storing a single copy of the model which all visitors would access by assigning the model to a static variable. This worked, and both browsers could see each other's modifications. However, it had its side effects. For example, if I fired up both browsers at the same time and the model was initialized, one browser would crash, and the other would work fine, despite me using a locking object so in theory one of them should have been delayed until the model was ready (of course I could have implemented this wrong ;)). Also, originally this site did use one model for all visitors and when it was live, it frequently shut down - killing the IIS application pool while it did. Now I'm not sure if this was related, but I don't really want to reintroduce whatever bug I had that caused this shut down. So, my question is a simple one really - what is the best way of either using the same model for all website users so they all see updates, or, if they do have separate copies (which I imagine will have a performance impact in time), how can the models detect changes in the database and update themselves accordingly? Thanks in advance for any advice! Regards, Richard Moss
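    For what it's worth, the usual middle ground for web applications is neither a context per session nor a single shared static context, but one context per HTTP request: every user sees fresh data on each request, nothing is shared across threads, and the object context is cheap to create. A minimal sketch (SiteEntities and the cache key are assumed names):

        public static class ModelProvider
        {
            private const string Key = "__siteEntities";

            public static SiteEntities Current      // SiteEntities = your generated object context (assumed name)
            {
                get
                {
                    var items = System.Web.HttpContext.Current.Items;
                    var model = items[Key] as SiteEntities;
                    if (model == null)
                    {
                        model = new SiteEntities();
                        items[Key] = model;         // lives only for the current request
                    }
                    return model;
                }
            }

            public static void DisposeCurrent()     // call from Application_EndRequest in Global.asax
            {
                var model = System.Web.HttpContext.Current.Items[Key] as IDisposable;
                if (model != null) model.Dispose();
            }
        }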


  • Approach For Syncing One SharePoint List With One or More SharePoint Lists

    - by plattnum
    What would be the best approach or strategy for configuring, customizing or developing in SharePoint a solution that allows me to keep one or more SharePoint lists in sync with a SharePoint list I have designated as a master or parent list. I would like to be able to create a master/parent list of some information that can be extended or used by different parts of the organization without them being able to CRUD any items on the actual columns of the master list. (I have seen some commercial web parts that offer column security on SharePoint lists and although that’s one way of potentially meeting my needs I would like to explore other options.) Scenario: I have a list called FOO: FOO Title Description I would like to create a new list BAR based off of FOO (BAR is managed by sub-organization that doesn't have access to FOO List): BAR FOO.Title (Read-Only) FOO.Description (Read-Only) NewColumn1 NewColumn2 Actions: Create- If a new item is entered in FOO I would like the new item added to BAR. Read - N/A Update - If the title or description is changed in FOO I would like it changed in BAR. Delete- No Deletes in the scenario. (Deletes are handled by the business with status column.) Templates with content extraction offer me this but it’s a one time shot at list creation. Just not sure what the best approach or strategy would be for this in MOSS 2007. Thanks!
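    One approach that works in MOSS 2007 without third-party web parts (a sketch; list and column names are assumed from the scenario above) is an item event receiver bound to the master list FOO that mirrors adds and updates into BAR:

        public class FooSyncReceiver : SPItemEventReceiver
        {
            public override void ItemAdded(SPItemEventProperties properties)
            {
                using (SPWeb web = properties.OpenWeb())
                {
                    SPList bar = web.Lists["BAR"];
                    SPListItem copy = bar.Items.Add();
                    copy["Title"] = properties.ListItem["Title"];
                    copy["Description"] = properties.ListItem["Description"];
                    copy["MasterId"] = properties.ListItem.ID;   // assumed column used to match items later
                    copy.Update();
                }
            }

            public override void ItemUpdated(SPItemEventProperties properties)
            {
                // look up the BAR item whose MasterId equals properties.ListItem.ID
                // and re-copy Title/Description there (omitted for brevity)
            }
        }

    If BAR lives in another site or site collection, the receiver would need to open that site explicitly rather than using properties.OpenWeb().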


  • Dec 5th Links: ASP.NET, ASP.NET MVC, jQuery, Silverlight, Visual Studio

    - by ScottGu
    Here is the latest in my link-listing series.  Also check out my VS 2010 and .NET 4 series for another on-going blog series I’m working on. [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu] ASP.NET ASP.NET Code Samples Collection: J.D. Meier has a great post that provides a detailed round-up of ASP.NET code samples and tutorials from a wide variety of sources.  Lots of useful pointers. Slash your ASP.NET compile/load time without any hard work: Nice article that details a bunch of optimizations you can make to speed up ASP.NET project load and compile times. You might also want to read my previous blog post on this topic here. 10 Essential Tools for Building ASP.NET Websites: Great article by Stephen Walther on 10 great (and free) tools that enable you to more easily build great ASP.NET Websites.  Highly recommended reading. Optimize Images using the ASP.NET Sprite and Image Optimization Framework: A nice article by 4GuysFromRolla that discusses how to use the open-source ASP.NET Sprite and Image Optimization Framework (one of the tools recommended by Stephen in the previous article).  You can use this to significantly improve the load-time of your pages on the client. Formatting Dates, Times and Numbers in ASP.NET: Scott Mitchell has a great article that discusses formatting dates, times and numbers in ASP.NET.  A very useful link to bookmark.  Also check out James Michael’s DateTime is Packed with Goodies blog post for other DateTime tips. Examining ASP.NET’s Membership, Roles and Profile APIs (Part 18): Everything you could possibly want to known about ASP.NET’s built-in Membership, Roles and Profile APIs must surely be in this tutorial series. Part 18 covers how to store additional user info with Membership. ASP.NET with jQuery An Introduction to jQuery Templates: Stephen Walther has written an outstanding introduction and tutorial on the new jQuery Template plugin that the ASP.NET team has contributed to the jQuery project. Composition with jQuery Templates and jQuery Templates, Composite Rendering, and Remote Loading: Dave Ward has written two nice posts that talk about composition scenarios with jQuery Templates and some cool scenarios you can enable with them. Using jQuery and ASP.NET to Build a News Ticker: Scott Mitchell has a nice tutorial that demonstrates how to build a dynamically updated “news ticker” style UI with ASP.NET and jQuery. Checking All Checkboxes in a GridView using jQuery: Scott Mitchell has a nice post that covers how to use jQuery to enable a checkbox within a GridView’s header to automatically check/uncheck all checkboxes contained within rows of it. Using jQuery to POST Form Data to an ASP.NET AJAX Web Service: Rick Strahl has a nice post that discusses how to capture form variables and post them to an ASP.NET AJAX Web Service (.asmx). ASP.NET MVC ASP.NET MVC Diagnostics Using NuGet: Phil Haack has a nice post that demonstrates how to easily install a diagnostics page (using NuGet) that can help identify and diagnose common configuration issues within your apps. ASP.NET MVC 3 JsonValueProviderFactory: James Hughes has a nice post that discusses how to take advantage of the new JsonValueProviderFactory support built into ASP.NET MVC 3.  This makes it easy to post JSON payloads to MVC action methods. 
Practical jQuery Mobile with ASP.NET MVC: James Hughes has another nice post that discusses how to use the new jQuery Mobile library with ASP.NET MVC to build great mobile web applications. Credit Card Validator for ASP.NET MVC 3: Benjii Me has a nice post that demonstrates how to build a [CreditCard] validator attribute that can be used to easily validate credit card numbers are in the correct format with ASP.NET MVC. Silverlight Silverlight FireStarter Keynote and Sessions: A great blog post from John Papa that contains pointers and descriptions of all the great Silverlight content we published last week at the Silverlight FireStarter.  You can watch all of the talks online.  More details on my keynote and Silverlight 5 announcements can be found here. 31 Days of Windows Phone 7: 31 great tutorials on how to build Windows Phone 7 applications (using Silverlight).  Silverlight for Windows Phone Toolkit Update: David Anson has a nice post that discusses some of the additional controls provided with the Silverlight for Windows Phone Toolkit. Visual Studio JavaScript Editor Extensions: A nice (and free) Visual Studio plugin built by the web tools team that significantly improves the JavaScript intellisense support within Visual Studio. HTML5 Intellisense for Visual Studio: Gil has a blog post that discusses a new extension my team has posted to the Visual Studio Extension Gallery that adds HTML5 schema support to Visual Studio 2008 and 2010. Team Build + Web Deployment + Web Deploy + VS 2010 = Goodness: Visual blogs about how to enable a continuous deployment system with VS 2010, TFS 2010 and the Microsoft Web Deploy framework.  Visual Studio 2010 Emacs Emulation Extension and VIM Emulation Extension: Check out these two extensions if you are fond of Emacs and VIM key bindings and want to enable them within Visual Studio 2010. Hope this helps, Scott


  • The Winds of Change are a Blowin'

    - by Ajarn Mark Caldwell
    For six years I have been an avid and outspoken fan and paying customer of SourceGear products…from Vault to Dragnet to Fortress and on to Vault Professional, but that is all changing now.  Not the fan part, but the paying customer part.  I’m still a huge fan.  I think that SourceGear does a great job with their product and support has been fantastic when needed (which is not very often).  I think that Eric Sink has done a fine job building a quality company and products, and I appreciate his contributions to the tech community through this blogging and books.  I still think their products are high quality and do a fantastic job of what they do.  But there’s the rub…what they do is no longer enough for me. As I have rebuilt our development team over the last couple of years, and we have begun to investigate Scrum and Kanban, I realize that I need more visibility into the progress of the team.  I need better project management tools, and this is where Vault Professional lags behind several other tools.  Granted, in the latest release (Vault 6.0) they added a nice time tracking feature, but I want more.  (Note, I did contact SourceGear about my quest for more, but apparently, the rest of their customer base has not been clamoring for this and so they have not built it.  Granted, I wasn’t clamoring for it either until just recently, but unfortunately for SourceGear, I want it now and don’t want to wait for them to build it into their system.) Ironically, it was SourceGear themselves who started to turn me on to the possibilities of other tools.  They built a limited integration with Axosoft OnTime which I read about several times on their support site (I used to regularly read and occasionally comment on their Support Forum).  I decided to check out OnTime and was very impressed with the tool for work item tracking and project management (not to mention their great Scrum Master in 10 Minutes video).  I fell in love with the capabilities of OnTime.  Unfortunately, the integration with Vault for source control management was, as I mentioned, limited.  I could have forfeited the integration between work items and source code, but there is too much benefit to linking check-ins to work items for me to give that up.  So then I did what was previously unthinkable for me, I considered switching not just the work tracking tool, but also the source code management tool.  This was really stepping outside my comfort zone because source code is Gold, and not to be trifled with.  When you find a good weapon to protect your gold, stick with it. I looked at Git and Tortoise SVN, but the integration methods for those was pretty rough compared to what I was used to.  The recommended tool from Axosoft’s point of view appeared to be RocketSVN, but I really wasn’t sure I wanted to go the “flavor of Subversion” route.  Then I started thinking about that other tool I liked back when I first chose to go with Vault, but couldn’t afford:  Team Foundation Server.  And what do you know…Microsoft has not only radically improved it over that version from back in 2006, but they also came to their senses about how it should be licensed, and it is much more affordable now.  So I started looking into the latest capabilities in the 2012 version, and I fell in love all over again. I really went deep on checking out the tools.  I watched numerous webcasts from Microsoft partners, went to a beta preview on Microsoft’s campus, and watched a lot of Channel 9 videos on the new ALM features (oooh…shiny).  
Frankly, I was very impressed with the capabilities of the newest version, and figured this was probably our direction.  As an interesting twist of fate, one of my employees crossed paths with an ALM Consultant from Northwest Cadence, a local Microsoft Partner, and one of the companies that produced several of the webcasts that I had been watching.  So I gave Bryon a call and started grilling him to see if he really knew anything or was just another guy who couldn’t find a job so he called himself a consultant.  It turns out Bryon actually knows a lot, especially in an area that was becoming a frustration point for us: Branching strategies and automated builds (that’s probably a whole separate blog entry).  As we talked, Bryon suggested we look into doing a DTDPS (Developer Tools Deployment Planning Services) session with his company.  This is a service that can be paid for by Microsoft Enterprise Agreement planning services credits or SA training benefits, and, again, coincidentally, we had several that were just about to expire, so I put them to good use. The DTDPS sessions were great; and Bryon, Rick, and the rest of the folks at Northwest Cadence have been a pleasure to work with.  We have just purchased a new server for our TFS rollout and are planning the steps and options right now.  This is still a big project ahead of us to not only install and configure TFS, but also to load all of our source code (many different systems, not just one program) and transition to the new way of life with TFS, but I am convinced that it is the right move for my team at this point in time.  We need the new capabilities that are in alignment with Scrum and Kanban methodologies in order to more efficiently manage all the different projects that we have going on at one time. I would still wholeheartedly endorse SourceGear’s products and Axosoft’s OnTime for those whose needs are met by those tools, but for me and my team, I think that TFS is the right fit, and I am looking forward to the change.


  • Back Up to Tape the Way You Shop For Groceries

    - by rickramsey
    Imagine if this was how you shopped for groceries: From the end of the aisle sprint to the point where you reach the ketchup. Pull a bottle from the shelf and yell at the top of your lungs, “Got it!” Sprint back to the end of the aisle. Start again and sprint down the same aisle to the mustard, pull a bottle from the shelf and again yell for the whole store to hear, “Got it!” Sprint back to the end of the aisle. Repeat this procedure for every item you need in the aisle. Proceed to the next aisle and follow the same steps for the list of items you need from that aisle. Sounds ridiculous, doesn’t it? Not only is it horribly inefficient, it’s exhausting and can lead to wear out failures on your grocery cart, or worse, yourself. This is essentially how NetApp and some other applications write NDMP backups to tape. In the analogy, the ketchup and mustard are the files to be written, yelling “Got it!” is the equivalent of a sync mark at the end of a file, and the sprint back to the end of an aisle is the process most commonly called a “backhitch” where the drive has to back up on a tape to start writing again. Writing to tape in this way results in very slow tape drive performance and imposes unnecessary wear on the tape drive and the media, especially when writing small files. The good news is not all tape drives behave this way when writing small files. Unlike midrange LTO drives, Oracle’s StorageTek T10000D tape drive is designed to handle this scenario efficiently. The difference between the two drive types is that the T10000D drive gives you the ability to write files in a NetApp NDMP backup environment the way you would normally shop for groceries. With grocery shopping, you essentially stream through aisles picking up items as you go, and then after checking out, yell, “Got it!”, though you might do that last step silently. With the T10000D, it has a feature called the Tape Application Accelerator, which prevents the drive from having to stop after each file is written to notify NetApp or another application that the write was successful. When enabled in the T10000D tape drive, Tape Application Accelerator causes the tape drive to respond to tape mark and file sync commands differently than when disabled: A tape mark received by the tape drive is treated as a buffered tape mark. A file sync received by the tape drive is treated as a no op command. Since buffered tape marks and no op commands do not cause the tape drive to empty the contents of its buffer to tape and backhitch, the data is written to tape in significantly less time. Oracle has emulated NetApp environments with a number of different file sizes and found the following when comparing the T10000D with the Tape Application Accelerator enabled versus LTO6 tape drives. Notice how the T10000D is not only monumentally faster, but also remarkably consistent? In addition, the writing of the 50 GB of files is done without a single backhitch. The LTO6 drive, meanwhile, will perform as many as 3,800 backhitches! At the end of writing the entire set of files, the T10000D tape drive reports back to the application, in this case NetApp, that the write was successful via a tape mark. So if the Tape Application Accelerator dramatically improves performance and reliability, why wouldn’t you always have it enabled? The reason is because tape drive buffers are meant to be just temporary data repositories so in the event of a power loss, there could be data loss in certain environments for the files that resided in the buffer. 
Fortunately, we do have best practices depending on your environment to avoid this from happening. I highly recommend reading Maximizing Tape Performance with StorageTek T10000 Tape Drives (pdf) to decide which best practice is right for you. The white paper also digs deeper into the benefits of the Tape Application Accelerator. The white paper is free, and after downloading it you can decide for yourself whether you want to yell “Got it!” out loud or just silently to yourself. Customer Advisory Panel One final link: Oracle has started up a Customer Advisory Panel program to collect feedback from customers on their current experiences with Oracle products, as well as desires for future product development. If you would like to participate in the program, go to this link at oracle.com. photo taken on Idaho's Sacajewea Historic Biway by Rick Ramsey - Brian Zents Follow OTN on Blog | Facebook | Twitter | YouTube


  • Stir Trek: Iron Man Edition Recap and Photos

    - by Brian Jackett
    If you’ve noticed my blogging activity has reduced in frequency and technical content lately it’s primarily due to all of the conferences I’ve been attending, speaking at, or planning in the past few months.  This past Friday myself and six other dedicated individuals put on Stir Trek: Iron Man Edition as the culmination of a few months of hard work.  For those unfamiliar, Stir Trek is a web developer conference that was founded last year as an event to showcase content from Microsoft’s MIX conference and end the day with a private showing of the then just-released Star Trek movie.  This year’s conference expanded from 2 to 4 content tracks and upped the number of tickets from 350 to 600.  Even more amazing was the fact that we had 592 people show up day of the event for the lowest drop-off percentage of any conference I’ve been to before.   Nerd Dinner and Swag Bags     The night before Stir Trek: Iron Man Edition we hosted a nerd dinner at the Polaris Shopping mall food court with about 30 in attendance.  Nerd dinners are a great time to meet others passionate about technology and socialize before the whirlwind of the conference hits.  After the nerd dinner 20+ volunteers headed to the conference location and helped us stuff swag bags.  This in and of itself was a monumental task of putting together 600 swag bags with numerous leaflets, sponsor items, and t-shirts.  A big thanks goes out to all who assisted us that night so that we could finish in just under 2 hours instead of taking all night.  My sleep schedule also thanks you. Morning of Stir Trek     After getting a decent amount of sleep I arrived at Marcus Crosswoods theater at 6am to begin setting up for the day.  Myself and Jody Morgan were in charge of registration so we got tables set up, laid out swag bags, and organized our volunteer crew to assist with checking-in attendees.  Despite having 600+ people registration went fairly smoothly and got the day off to a great start.  I especially appreciated the 3+ cups of coffee from Crimson Cup, a local coffee shop.  For any of you that know me you’ll know that I rarely drink coffee except a few times a year when I really need the energy, so that says a lot about how good their coffee is.   Conference Starts     Once registration was completed the day kicked off with Molly Holzschlag keynoting.  Unfortunately Molly suffered from an ear infection and wasn’t able to fly so she had a virtual keynote and a session later in the day.  I was working behind the scenes on various tasks so I was only able to drop in very briefly on the keynote and rest of the morning sessions.  Throughout the day I tried to grab at least 1 or 2 pics of each presenter.  See my album below for the full set of pics.      For lunch we ordered around 150 pizzas from Mellow Mushroom, a local pizza place (notice the theme of supporting local businesses.)  Early on we were concerned about Mellow Mushroom being able to supply that many pizzas and get them delivered (still hot) to the theater, but they did an excellent job day of the event.  I wish I had gotten some pictures of the old school VW van they delivered the pizza in, but I was just a bit busy running around trying to get theaters ready for lunch.  We had attendees from last year who specifically requested that we have Mellow Mushroom supply lunch this year and I’m glad everything worked out being able to use them again.     During the afternoon I was able to attend a few sessions and hear some great content from various speakers.  
It was also nice to just sit down and get off my feet for a bit.  After the last sessions the day concluded with a raffle.  There were a few logistical and technical issues that hampered our ability to smoothly conduct the raffle.  To those of you that agree the raffle wasn’t the smoothest experience I would like to say that the Stir Trek planning committee has already begun meeting to discuss ways of improving the conference for next year.  We are also accepting feedback (both positive and negative) at the following link: click here.  If you don’t wish to use the Joind In site you can also email me directly and I’ll be sure to pass along the feedback.   Iron Man 2 Movie     Last but not least, what Stir Trek event would be complete without the feature movie.  This year’s movie was Iron Man 2.  The theater had some really cool props and promotions (see pic below) for the movie.  I really enjoyed Iron Man 2, but I would recommend brushing up on the Iron Man comics and Marvel’s plans for future movies to understand some of the plot elements that come up.  Also make sure you stay through to the end of the movie credits to see a sneak peak of something special, that’s all I’ll say. Conclusion     Again a big thanks goes out to all of the speakers, sponsors, attendees, movie theater staff, volunteers, and everyone else involved in making this event great.  Also big thanks to my fellow Stir Trek planning committee members: Jeff Blankenburg, Matt Casto, Carey Payette, Jody Morgan, Rick Kierner, and Sarah Dutkiewitcz.  I am grateful for everything I learned while helping plan this event and look forward to being involved again next year.  For those interested we are currently targeting Thor as our movie theme for 2011 and then The Avengers for 2012.  These are tentative based on release dates that could shift as we get closer, but for now look solid.   Photos Pics on Facebook (includes tagging)     Stir Trek: Iron Man Edition photos on Facebook Pics on Live site (higher res)      View Full Album         -Frog Out


  • SQLAuthority News – #SQLPASS 2012 Seattle Update – Memorylane 2009, 2010, 2011

    - by pinaldave
    Today is the first day of the SQLPASS 2012 and I will be soon posting SQL Server 2012 experience over here. Today when I landed in Seattle, I got the nostalgia feeling. I used to stay in the USA. I stayed here for more than 7 years – I studied here and I worked in USA. I had lots of friends in Seattle when I used to stay in the USA. I always wanted to visit Seattle because it is THE place. I remember once I purchased a ticket to travel to Seattle through Priceline (well it was the cheapest option and I was a student) but could not fly because of an interesting issue. I used to be Teaching Assistant of an advanced course and the professor asked me to build a pop-quiz for the course. I unfortunately had to cancel the trip. Before I returned to India – I pretty much covered every city existed in my list to must visit, except one – Seattle. It was so interesting that I never made it to Seattle even though I wanted to visit, when I was in USA. After that one time I never got a chance to travel to Seattle. After a few years I also returned to India for good. Once on Television I saw “Sleepless in Seattle” movie playing and I immediately changed the channel as it reminded me that I never made it to Seattle before. However, destiny has its own way to handle decisions. After I returned to India – I visited Seattle total of 5 times and this is my 6th visit to Seattle in less than 3 years. I was here for 3 previous SQLPASS events – 2009, 2010, and 2011 as well two Microsoft Most Valuable Professional Summit in 2009 and 2010. During these five trips I tried to catch up with all of my all friends but I realize that time has its own way of doing things. Many moved out of Seattle and many were too busy revive the old friendship but there were few who always make a point to meet me when I travel to the city. During the course of my visits I have made few fantastic new friends – Rick Morelan (Joes 2 Pros) and Greg Lynch. Every time I meet them I feel that I know them for years. I think city of Seattle has played very important part in our relationship that I got these fantastic friends. SQLPASS is the event where I find all of my SQL Friends and I look for this event for an entire year. This year’s my goal is to meet as many as new friends I can meet. If you are going to be at SQLPASS – FIND ME. I want to have a photo with you. I want to remember each name as I believe this is very important part of our life – making new friends and sustaining new friendship. Here are few of the pointers where you can find me. All Keynotes – Blogger’s Table Exhibition Booth Joes 2 Pros Booth #117 – Do not forget to stop by at the booth – I might have goodies for you – limited editions. Book Signing Events – Check details in tomorrow’s blog or stop by Booth #117 Evening Parties 6th Nov – Welcome Reception Evening Parties 7th Nov - Exhibitor Reception – Do not miss Booth #117 Evening Parties 8th Nov - Community Appreciation Party Additionally at few other locations – Embarcadero Booth In Coffee shops in Convention Center If you are SQLPASS – make sure that I find an opportunity to meet you at the event. Reserve a little time and lets have a coffee together. I will be continuously tweeting about my where about on twitter so let us stay connected on twitter. Here is my experience of my earlier experience of attending SQLPASS. 
SQLAuthority News – Book Signing Event – SQLPASS 2011 Event Log SQLAuthority News – Meeting SQL Friends – SQLPASS 2011 Event Log SQLAuthority News – Story of Seattle – SQLPASS 2011 Event Log SQLAuthority News – SQLPASS Nov 8-11, 2010-Seattle – An Alternative Look at Experience SQLAuthority News – Notes of Excellent Experience at SQL PASS 2009 Summit, Seattle Let us meet! Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology


  • T-SQL Tuesday - the swag

    - by Rob Farley
    This month’s T-SQL Tuesday is hosted by Kendal van Dyke (@SQLDBA), and is on the topic of swag. He asks about the best SQL Server swag that we’ve ever received from a conference. I can’t say I ever focus on getting the swag at conferences, as I see some people doing. I know there are plenty of people that get around all the sponsors as soon as they’ve arrived, collecting whatever goodies they can, sometimes as token gifts for those at home, sometimes as giveaways for the user groups they attend. I remember a few years ago at my first PASS Summit, the SQLCAT team gave me a large pile of leftover SQL Server swag to give away to my user group – piles of branded things to stop your phone sliding off your car dashboard, and other things. The user group members thought it was great, and over the course of a few months, happily cleared me out of it all. I tend to consider swag to be something that you haven’t earned except by being at a conference, and there was no winning associated with it, it was simply a giveaway item at a sponsor booth. That means I don’t include the HP Mini laptop that was given away at TechEd Australia a few years ago to every attendee, or the SQL Server bag and Camelbak bottle that I was given as a thank-you for writing a guest blog post (which I use as my regular laptop bag and water bottle for work). I don’t even include the copy of Midtown Madness that I got as a door prize at my vey first TechEd event in 1999 (that was a really good game, and even meant that when I went to Chicago last year, I felt a strange familiarity about the place). I don’t want to include shirts in the mix either. I was given a nice SQL Server shirt about five years ago TechEd Australia. It’s a business shirt (buttons, cuffs, pocket on the chest), black with the SQL Server logo on it. It was such a nice shirt that I commented about it to the Product Marketing Manager for Australia (Christine, at the time), who unexpectedly arranged for me to get another one. That was certainly an improvement on the tent I was given at one of the MVP conference I attended. So when I consider these ‘rules’, two pieces of swag come to mind, and I think both were at PASS Summits (although I can’t be sure). One was a hand-warmer from HP, one of the “crystallisation-type” ones, which proved extremely popular when I got home, until one day when it didn’t survive being recharged – not overly SQL related, but still it was good swag. The other was an umbrella, from expressor, which was from the PASS Summit in 2010, my first PASS Summit. I remember it well – Blythe Morrow (now Gietz) (@blythemorrow) was working the booth, having stopped working for PASS some time before, but she’d been on my list of people to meet, as I’d had plenty of contact with her while she’d worked at PASS, my being a chapter leader and general volunteer. There had been an expressor dinner on one of the first evenings, which I’d been asked to be at, which is when I’d met lots of SQL people in person for the first time, including Ted Krueger (@onpnt), Jessica Moss (@jessicamoss) and Blythe. Anyway, at some point the next day I swung by their booth to say hello and thank them for the dinner, and Blythe says “Oh, we have the best swag – here!” and handed me an umbrella. And she was right. It’s excellent. @rob_farley


  • Data Connection Library in SharePoint Server 2010

    - by Wayne
    What is a Data Connection Library in SharePoint Server 2010? A Data Connection Library in Microsoft SharePoint Server 2010 is a library that can contain two kinds of data connections: an Office Data Connection (ODC) file or a Universal Data Connection (UDC) file. Microsoft InfoPath 2010 uses data connections that comply with the Universal Data Connection (UDC) file schema and typically have either a *.udcx or *.xml file name extension. Data sources described by these data connections are stored on the server and can be used in standard form templates and browser-enabled form templates. How to create a SharePoint Server Data Connection Library? 1. Browse to a SharePoint Server 2010 site on which you have at least Design permissions. If you are on the root site, create a new site before you continue with the next step. 2. On the Site Actions menu, click More Options. 3. On the Create page, click Library under Filter By, and then click Data Connection Library. 4. On the right side of the Create page, type a name for the library, and then click the Create button. 5. Copy the URL of the new data connection library. How to create a new data connection file in InfoPath? 1. Open InfoPath Designer 2010, click Blank Form, and then click Design Form. 2. On the Data tab, click Data Connections, and then click Add. 3. In the Data Connection Wizard, click Create a new connection to, click Receive data, and then click Next. 4. Click the kind of data source that you are connecting to, such as Database, Web service, or SharePoint library or list, and then click Next. 5. Complete the remaining steps in the Data Connection Wizard to configure your data connection, and then click Finish to return to the Data Connections dialog box. 6. In the Data Connections dialog box, click Convert to Connection File. 7. In the Convert Data Connection dialog box, enter the URL of the data connection library that you previously copied (delete "Forms/AllItems.aspx" and anything following it from the URL), enter a name for the data connection file at the end of the URL, and then click OK. It will take a few moments to convert and save the data connection file to the library. 8. Confirm that the data connection was converted successfully by examining the Details section of the Data Connections dialog box while the name of the converted data connection is selected. 9. Browse to the SharePoint data connection library, click the drop-down next to the name of the data connection, click Approve/Reject, click Approved, and then click OK. Take advantage of SharePoint Data Connection Library and other useful features of SharePoint family of products that include, SharePoint Foundation 2010, MOSS 2007 and free SharePoint templates or web parts.
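    The same library can also be created from code against the server object model, which is handy when it has to exist on many sites. A minimal sketch (the site URL and titles are assumed):

        using (SPSite site = new SPSite("http://server/sites/team"))
        using (SPWeb web = site.OpenWeb())
        {
            Guid id = web.Lists.Add(
                "Data Connections",                           // title (assumed)
                "Approved ODC and UDC connection files",      // description (assumed)
                SPListTemplateType.DataConnectionLibrary);

            SPList library = web.Lists[id];
            library.OnQuickLaunch = true;
            library.Update();
        }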


  • How to move complete SharePoint Server 2007 from one box to another

    - by DipeshBhanani
    It was time for my first onsite client assignment on SharePoint. The client had a one-server production environment. They wanted to upgrade the topology to a completely new SharePoint farm of three servers. So the task was to move the whole MOSS 2007 environment to the new servers without impacting data. The last three scary words “… without impacting data…” were really putting pressure on me. Moreover, the SSP had to be moved because additional information had been added for users apart from the AD import.   I thought I only had to do a backup and restore. It appeared pretty easy at first thought. Just because of those scary words, I decided to check the internet for guidance related to this scenario. I couldn't find anything except general guidance on moving servers on the Microsoft TechNet site. I promised myself I would start blogging with this post if I was successful in this task. Well, it took me a long time to write this, but I finally made it. I hope it will be useful to everyone looking to move a SharePoint server.   Before beginning restoration, make sure there is no difference in SharePoint versions between the source and destination servers. Also check whether the state of the SharePoint installation is the same at the time of backup and restore (e.g. SharePoint-related service packs and patches, if any).   The main tasks of the server movement are as follows:   1) Back up all the databases; 2) install and configure SharePoint on the new environment; 3) deploy all solutions (WSP files) globally to the destination server, to install the features attached to the solutions; 4) install all the custom features; 5) deploy/copy custom pages/files which were added to the “12 hive” folder; 6) restore the SSP; 7) restore My Site; 8) restore the other web applications.   Tasks 3 to 5 are for making sure that we have configured the environment well enough for the web application to be restored successfully. The main and most complex task was restoring the SSP. I started restoring the SSP through Central Admin. After a while, the restoration status was updated to “unsuccessful”. “Damn it, what went wrong?” I thought, looking at the error detail down the page. I can't remember the error message, but I corrected it and restored again.   Actually, once an SSP restore fails, unless you clean up all the related pieces properly, your restoration will fail again and again. I wanted to find the actual reason, so I cleaned, restored, cleaned, restored… I tried almost 5-6 times and finally I succeeded. I realized how pleasant it is to see the word “Successful” on the screen. Without wasting more of your time, let me write down the detailed steps of restoring the SSP:   1) Delete the SSP through the following STSADM command: stsadm -o deletessp -title <SSP name> -deletedatabases -force (e.g. stsadm -o deletessp -title SharedServices1 -deletedatabases -force). 2) Check and delete the web application associated with the SSP if it exists. 3) Check and remove the “Alternate Access Mapping” associated with the SSP if it exists. 4) Check and delete the IIS site as well as the application pool associated with the SSP if they exist. 5) Stop the following services: Office SharePoint Server Search, Windows SharePoint Services Search, and Windows SharePoint Services Help Search. 6) Delete all the databases associated with or related to the SSP from SQL Server. 7) Reset IIS. 8) Start the following services again: Office SharePoint Server Search, Windows SharePoint Services Search, and Windows SharePoint Services Help Search. 9) Restore the new SSP.
    After the SSP restoration, everything else completed very smoothly without any more issues. I made a few modifications to the sites for the server name change, and finally the new environment was ready.
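    For completeness, tasks 3 and 4 from the list above (deploying the WSP solutions globally and installing the custom features) typically come down to a handful of STSADM calls along these lines (solution names, feature names and URLs are assumed):

        REM deploy the custom solutions globally on the new farm
        stsadm -o addsolution -filename MyCustomParts.wsp
        stsadm -o deploysolution -name MyCustomParts.wsp -immediate -allowgacdeployment
        stsadm -o execadmsvcjobs

        REM activate custom features where required
        stsadm -o activatefeature -name MyFeatureFolder -url http://newserver/sites/portal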

