Search Results

Search found 29412 results on 1177 pages for 'microsoft office online'.


  • New features of C# 4.0

    This article covers the new features of C# 4.0. It is divided into the following sections: Introduction; Dynamic Lookup; Named and Optional Arguments; Features for COM interop; Variance; Relationship with Visual Basic; Resources. Other interesting readings: 22 New Features of Visual Studio 2008 for .NET Professionals; 50 New Features of SQL Server 2008; IIS 7.0 New Features.

    Introduction

    It is now close to a year since Microsoft Visual C# 3.0 shipped as part of Visual Studio 2008. In the VS Managed Languages team we are hard at work on creating the next version of the language (with the unsurprising working title of C# 4.0), and this document is a first public description of the planned language features as we currently see them. Please be advised that all this is in early stages of production and is subject to change. Part of the reason for sharing our plans in public so early is precisely to get the kind of feedback that will cause us to improve the final product before it rolls out.

    Simultaneously with the publication of this whitepaper, a first public CTP (community technology preview) of Visual Studio 2010 is going out as a Virtual PC image for everyone to try. Please use it to play and experiment with the features, and let us know of any thoughts you have. We ask for your understanding and patience working with very early bits, where especially new or newly implemented features do not have the quality or stability of a final product. The aim of the CTP is not to give you a productive work environment but to give you the best possible impression of what we are working on for the next release.

    The CTP contains a number of walkthroughs, some of which highlight the new language features of C# 4.0. Those are excellent for getting a hands-on guided tour through the details of some common scenarios for the features. You may consider this whitepaper a companion document to these walkthroughs, complementing them with a focus on the overall language features and how they work, as opposed to the specifics of the concrete scenarios.

    C# 4.0

    The major theme for C# 4.0 is dynamic programming. Increasingly, objects are “dynamic” in the sense that their structure and behavior is not captured by a static type, or at least not one that the compiler knows about when compiling your program. Some examples include:

    a. objects from dynamic programming languages, such as Python or Ruby
    b. COM objects accessed through IDispatch
    c. ordinary .NET types accessed through reflection
    d. objects with changing structure, such as HTML DOM objects

    While C# remains a statically typed language, we aim to vastly improve the interaction with such objects. A secondary theme is co-evolution with Visual Basic. Going forward we will aim to maintain the individual character of each language, but at the same time important new features should be introduced in both languages at the same time. They should be differentiated more by style and feel than by feature set.

    The new features in C# 4.0 fall into four groups:

    Dynamic lookup. Dynamic lookup allows you to write method, operator and indexer calls, property and field accesses, and even object invocations which bypass the C# static type checking and instead get resolved at runtime.

    Named and optional parameters. Parameters in C# can now be specified as optional by providing a default value for them in a member declaration. When the member is invoked, optional arguments can be omitted. Furthermore, any argument can be passed by parameter name instead of position.

    COM specific interop features. Dynamic lookup as well as named and optional parameters both help make programming against COM less painful than today. On top of that, however, we are adding a number of other small features that further improve the interop experience.

    Variance. It used to be that an IEnumerable<string> wasn’t an IEnumerable<object>. Now it is – C# embraces type safe “co- and contravariance” and common BCL types are updated to take advantage of that.
    Dynamic Lookup

    Dynamic lookup allows you a unified approach to invoking things dynamically. With dynamic lookup, when you have an object in your hand you do not need to worry about whether it comes from COM, IronPython, the HTML DOM or reflection; you just apply operations to it and leave it to the runtime to figure out what exactly those operations mean for that particular object.

    This affords you enormous flexibility, and can greatly simplify your code, but it does come with a significant drawback: static typing is not maintained for these operations. A dynamic object is assumed at compile time to support any operation, and only at runtime will you get an error if it wasn’t so. Oftentimes this will be no loss, because the object wouldn’t have a static type anyway; in other cases it is a tradeoff between brevity and safety. In order to facilitate this tradeoff, it is a design goal of C# to allow you to opt in or opt out of dynamic behavior on every single call.

    The dynamic type

    C# 4.0 introduces a new static type called dynamic. When you have an object of type dynamic you can “do things to it” that are resolved only at runtime:

        dynamic d = GetDynamicObject(…);
        d.M(7);

    The C# compiler allows you to call a method with any name and any arguments on d because it is of type dynamic. At runtime the actual object that d refers to will be examined to determine what it means to “call M with an int” on it. The type dynamic can be thought of as a special version of the type object, which signals that the object can be used dynamically.

    It is easy to opt in or out of dynamic behavior: any object can be implicitly converted to dynamic, “suspending belief” until runtime. Conversely, there is an “assignment conversion” from dynamic to any other type, which allows implicit conversion in assignment-like constructs:

        dynamic d = 7;  // implicit conversion
        int i = d;      // assignment conversion

    Dynamic operations

    Not only method calls, but also field and property accesses, indexer and operator calls and even delegate invocations can be dispatched dynamically:

        dynamic d = GetDynamicObject(…);
        d.M(7);               // calling methods
        d.f = d.P;            // getting and setting fields and properties
        d["one"] = d["two"];  // getting and setting through indexers
        int i = d + 3;        // calling operators
        string s = d(5, 7);   // invoking as a delegate

    The role of the C# compiler here is simply to package up the necessary information about “what is being done to d”, so that the runtime can pick it up and determine what the exact meaning of it is given an actual object d. Think of it as deferring part of the compiler’s job to runtime. The result of any dynamic operation is itself of type dynamic.

    Runtime lookup

    At runtime a dynamic operation is dispatched according to the nature of its target object d:

    COM objects. If d is a COM object, the operation is dispatched dynamically through COM IDispatch. This allows calling to COM types that don’t have a Primary Interop Assembly (PIA), and relying on COM features that don’t have a counterpart in C#, such as indexed properties and default properties.

    Dynamic objects. If d implements the interface IDynamicObject, d itself is asked to perform the operation. Thus by implementing IDynamicObject a type can completely redefine the meaning of dynamic operations. This is used intensively by dynamic languages such as IronPython and IronRuby to implement their own dynamic object models. It will also be used by APIs, e.g. by the HTML DOM to allow direct access to the object’s properties using property syntax.

    Plain objects. Otherwise d is a standard .NET object, and the operation will be dispatched using reflection on its type and a C# “runtime binder” which implements C#’s lookup and overload resolution semantics at runtime. This is essentially a part of the C# compiler running as a runtime component to “finish the work” on dynamic operations that was deferred by the static compiler.
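    A minimal sketch of that runtime dispatch, using the names these pieces ended up with in the released .NET 4.0 (ExpandoObject as a ready-made dynamic object and RuntimeBinderException for a failed binding – the CTP's names may differ, and the property names below are invented for illustration):

        using System;
        using System.Dynamic;
        using Microsoft.CSharp.RuntimeBinder;

        class DynamicSketch
        {
            static void Main()
            {
                dynamic bag = new ExpandoObject();   // structure is defined at runtime
                bag.Name = "C# 4.0";                 // compiles even though Name is declared nowhere

                Console.WriteLine(bag.Name);         // resolved at runtime: prints "C# 4.0"

                try
                {
                    Console.WriteLine(bag.Version);  // no such member: fails only at runtime
                }
                catch (RuntimeBinderException ex)
                {
                    Console.WriteLine("Runtime binding failed: " + ex.Message);
                }
            }
        }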
    Example

    Assume the following code:

        dynamic d1 = new Foo();
        dynamic d2 = new Bar();
        string s;
        d1.M(s, d2, 3, null);

    Because the receiver of the call to M is dynamic, the C# compiler does not try to resolve the meaning of the call. Instead it stashes away information for the runtime about the call. This information (often referred to as the “payload”) is essentially equivalent to: “Perform an instance method call of M with the following arguments: 1. a string, 2. a dynamic, 3. a literal int 3, 4. a literal object null.”

    At runtime, assume that the actual type Foo of d1 is not a COM type and does not implement IDynamicObject. In this case the C# runtime binder picks up the job of finishing overload resolution based on runtime type information, proceeding as follows:

    1. Reflection is used to obtain the actual runtime types of the two objects, d1 and d2, that did not have a static type (or rather had the static type dynamic). The result is Foo for d1 and Bar for d2.
    2. Method lookup and overload resolution is performed on the type Foo with the call M(string, Bar, 3, null) using ordinary C# semantics.
    3. If the method is found it is invoked; otherwise a runtime exception is thrown.

    Overload resolution with dynamic arguments

    Even if the receiver of a method call is of a static type, overload resolution can still happen at runtime. This can happen if one or more of the arguments have the type dynamic:

        Foo foo = new Foo();
        dynamic d = new Bar();
        var result = foo.M(d);

    The C# runtime binder will choose between the statically known overloads of M on Foo, based on the runtime type of d, namely Bar. The result is again of type dynamic.
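    A small self-contained sketch of that last point (the type and overloads here are invented stand-ins, not the whitepaper's Foo/Bar): because the argument is dynamic, the overload of Describe is chosen at runtime from each value's actual type.

        using System;

        class RuntimeOverloads
        {
            static string Describe(string s) { return "a string of length " + s.Length; }
            static string Describe(int i)    { return "the int " + i; }
            static string Describe(object o) { return "some other object: " + o.GetType().Name; }

            static void Main()
            {
                object[] inputs = { "hello", 42, DateTime.Now };
                foreach (dynamic item in inputs)
                {
                    // The candidate overloads are statically known, but the runtime binder
                    // picks among them using item's runtime type on each iteration.
                    Console.WriteLine(Describe(item));
                }
            }
        }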
    The Dynamic Language Runtime

    An important component in the underlying implementation of dynamic lookup is the Dynamic Language Runtime (DLR), which is a new API in .NET 4.0. The DLR provides most of the infrastructure behind not only C# dynamic lookup but also the implementation of several dynamic programming languages on .NET, such as IronPython and IronRuby. Through this common infrastructure a high degree of interoperability is ensured, but just as importantly the DLR provides excellent caching mechanisms which serve to greatly enhance the efficiency of runtime dispatch.

    To the user of dynamic lookup in C#, the DLR is invisible except for the improved efficiency. However, if you want to implement your own dynamically dispatched objects, the IDynamicObject interface allows you to interoperate with the DLR and plug in your own behavior. This is a rather advanced task, which requires you to understand a good deal more about the inner workings of the DLR. For API writers, however, it can definitely be worth the trouble in order to vastly improve the usability of e.g. a library representing an inherently dynamic domain.

    Open issues

    There are a few limitations and things that might work differently than you would expect.

    · The DLR allows objects to be created from objects that represent classes. However, the current implementation of C# doesn’t have syntax to support this.
    · Dynamic lookup will not be able to find extension methods. Whether extension methods apply or not depends on the static context of the call (i.e. which using clauses occur), and this context information is not currently kept as part of the payload.
    · Anonymous functions (i.e. lambda expressions) cannot appear as arguments to a dynamic method call. The compiler cannot bind (i.e. “understand”) an anonymous function without knowing what type it is converted to.

    One consequence of these limitations is that you cannot easily use LINQ queries over dynamic objects:

        dynamic collection = …;
        var result = collection.Select(e => e + 5);

    If the Select method is an extension method, dynamic lookup will not find it. Even if it is an instance method, the above does not compile, because a lambda expression cannot be passed as an argument to a dynamic operation. There are no plans to address these limitations in C# 4.0.

    Named and Optional Arguments

    Named and optional parameters are really two distinct features, but are often useful together. Optional parameters allow you to omit arguments to member invocations, whereas named arguments are a way to provide an argument using the name of the corresponding parameter instead of relying on its position in the parameter list.

    Some APIs, most notably COM interfaces such as the Office automation APIs, are written specifically with named and optional parameters in mind. Up until now it has been very painful to call into these APIs from C#, with sometimes as many as thirty arguments having to be explicitly passed, most of which have reasonable default values and could be omitted. Even in APIs for .NET, however, you sometimes find yourself compelled to write many overloads of a method with different combinations of parameters, in order to provide maximum usability to the callers. Optional parameters are a useful alternative for these situations.

    Optional parameters

    A parameter is declared optional simply by providing a default value for it:

        public void M(int x, int y = 5, int z = 7);

    Here y and z are optional parameters and can be omitted in calls:

        M(1, 2, 3); // ordinary call of M
        M(1, 2);    // omitting z – equivalent to M(1, 2, 7)
        M(1);       // omitting both y and z – equivalent to M(1, 5, 7)

    Named and optional arguments

    C# 4.0 does not permit you to omit arguments between commas as in M(1,,3). This could lead to highly unreadable comma-counting code. Instead any argument can be passed by name. Thus if you want to omit only y from a call of M you can write:

        M(1, z: 3);    // passing z by name

    or

        M(x: 1, z: 3); // passing both x and z by name

    or even

        M(z: 3, x: 1); // reversing the order of arguments

    All forms are equivalent, except that arguments are always evaluated in the order they appear, so in the last example the 3 is evaluated before the 1. Optional and named arguments can be used not only with methods but also with indexers and constructors.
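    A compilable variation on the example above that also makes the evaluation-order rule observable (the Log helper is a made-up instrumentation aid, not part of the feature):

        using System;

        class NamedArgsDemo
        {
            static void M(int x, int y = 5, int z = 7)
            {
                Console.WriteLine("M(x={0}, y={1}, z={2})", x, y, z);
            }

            // Hypothetical helper: reports when its argument expression is evaluated.
            static int Log(string name, int value)
            {
                Console.WriteLine("evaluating " + name);
                return value;
            }

            static void Main()
            {
                M(1);                               // M(x=1, y=5, z=7)
                M(1, z: 3);                         // M(x=1, y=5, z=3)
                M(z: Log("z", 3), x: Log("x", 1));  // prints "evaluating z" before "evaluating x",
                                                    // then M(x=1, y=5, z=3)
            }
        }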
    Overload resolution

    Named and optional arguments affect overload resolution, but the changes are relatively simple: A signature is applicable if all its parameters are either optional or have exactly one corresponding argument (by name or position) in the call which is convertible to the parameter type. Betterness rules on conversions are only applied for arguments that are explicitly given – omitted optional arguments are ignored for betterness purposes. If two signatures are equally good, one that does not omit optional parameters is preferred.

        M(string s, int i = 1);
        M(object o);
        M(int i, string s = "Hello");
        M(int i);

        M(5);

    Given these overloads, we can see the working of the rules above. M(string,int) is not applicable because 5 doesn’t convert to string. M(int,string) is applicable because its second parameter is optional, and so, obviously, are M(object) and M(int). M(int,string) and M(int) are both better than M(object) because the conversion from 5 to int is better than the conversion from 5 to object. Finally M(int) is better than M(int,string) because no optional arguments are omitted. Thus the method that gets called is M(int).
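    The betterness rules can be checked directly with a throwaway sketch in which each overload simply reports that it won:

        using System;

        class OverloadDemo
        {
            static void M(string s, int i = 1)       { Console.WriteLine("M(string, int)"); }
            static void M(object o)                  { Console.WriteLine("M(object)"); }
            static void M(int i, string s = "Hello") { Console.WriteLine("M(int, string)"); }
            static void M(int i)                     { Console.WriteLine("M(int)"); }

            static void Main()
            {
                M(5);         // "M(int)" – applicable and omits no optional arguments
                M(5, "bye");  // "M(int, string)" – the only applicable two-argument overload
                M("hi");      // "M(string, int)" – string is a better conversion target than object
            }
        }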
    Features for COM interop

    Dynamic lookup as well as named and optional parameters greatly improve the experience of interoperating with COM APIs such as the Office Automation APIs. In order to remove even more of the speed bumps, a couple of small COM-specific features are also added to C# 4.0.

    Dynamic import

    Many COM methods accept and return variant types, which are represented in the PIAs as object. In the vast majority of cases, a programmer calling these methods already knows the static type of a returned object from context, but explicitly has to perform a cast on the returned value to make use of that knowledge. These casts are so common that they constitute a major nuisance.

    In order to facilitate a smoother experience, you can now choose to import these COM APIs in such a way that variants are instead represented using the type dynamic. In other words, from your point of view, COM signatures now have occurrences of dynamic instead of object in them. This means that you can easily access members directly off a returned object, or you can assign it to a strongly typed local variable without having to cast. To illustrate, you can now say

        excel.Cells[1, 1].Value = "Hello";

    instead of

        ((Excel.Range)excel.Cells[1, 1]).Value2 = "Hello";

    and

        Excel.Range range = excel.Cells[1, 1];

    instead of

        Excel.Range range = (Excel.Range)excel.Cells[1, 1];

    Compiling without PIAs

    Primary Interop Assemblies are large .NET assemblies generated from COM interfaces to facilitate strongly typed interoperability. They provide great support at design time, where your experience of the interop is as good as if the types were really defined in .NET. However, at runtime these large assemblies can easily bloat your program, and also cause versioning issues because they are distributed independently of your application.

    The no-PIA feature allows you to continue to use PIAs at design time without having them around at runtime. Instead, the C# compiler will bake the small part of the PIA that a program actually uses directly into its assembly. At runtime the PIA does not have to be loaded.

    Omitting ref

    Because of a different programming model, many COM APIs contain a lot of reference parameters. Contrary to refs in C#, these are typically not meant to mutate a passed-in argument for the subsequent benefit of the caller, but are simply another way of passing value parameters. It therefore seems unreasonable that a C# programmer should have to create temporary variables for all such ref parameters and pass these by reference. Instead, specifically for COM methods, the C# compiler will allow you to pass arguments by value to such a method, and will automatically generate temporary variables to hold the passed-in values, subsequently discarding these when the call returns. In this way the caller sees value semantics, and will not experience any side effects, but the called method still gets a reference.

    Open issues

    A few COM interface features still are not surfaced in C#. Most notably these include indexed properties and default properties. As mentioned above these will be respected if you access COM dynamically, but statically typed C# code will still not recognize them. There are currently no plans to address these remaining speed bumps in C# 4.0.

    Variance

    An aspect of generics that often comes across as surprising is that the following is illegal:

        IList<string> strings = new List<string>();
        IList<object> objects = strings;

    The second assignment is disallowed because strings does not have the same element type as objects. There is a perfectly good reason for this. If it were allowed you could write:

        objects[0] = 5;
        string s = strings[0];

    allowing an int to be inserted into a list of strings and subsequently extracted as a string. This would be a breach of type safety.

    However, there are certain interfaces where the above cannot occur, notably where there is no way to insert an object into the collection. Such an interface is IEnumerable<T>. If instead you say:

        IEnumerable<object> objects = strings;

    there is no way we can put the wrong kind of thing into strings through objects, because objects doesn’t have a method that takes an element in. Variance is about allowing assignments such as this in cases where it is safe. The result is that a lot of situations that were previously surprising now just work.

    Covariance

    In .NET 4.0 the IEnumerable<T> interface will be declared in the following way:

        public interface IEnumerable<out T> : IEnumerable
        {
            IEnumerator<T> GetEnumerator();
        }

        public interface IEnumerator<out T> : IEnumerator
        {
            bool MoveNext();
            T Current { get; }
        }

    The “out” in these declarations signifies that the T can only occur in output position in the interface – the compiler will complain otherwise. In return for this restriction, the interface becomes “covariant” in T, which means that an IEnumerable<A> is considered an IEnumerable<B> if A has a reference conversion to B. As a result, any sequence of strings is also e.g. a sequence of objects. This is useful e.g. in many LINQ methods. Using the declarations above:

        var result = strings.Union(objects); // succeeds with an IEnumerable<object>

    This would previously have been disallowed, and you would have had to do some cumbersome wrapping to get the two sequences to have the same element type.

    Contravariance

    Type parameters can also have an “in” modifier, restricting them to occur only in input positions. An example is IComparer<T>:

        public interface IComparer<in T>
        {
            public int Compare(T left, T right);
        }

    The somewhat baffling result is that an IComparer<object> can in fact be considered an IComparer<string>! It makes sense when you think about it: If a comparer can compare any two objects, it can certainly also compare two strings. This property is referred to as contravariance.
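    A minimal sketch of both directions, using the variant BCL types as they shipped in .NET 4.0 (note that the Limitations section below says the CTP may not yet include these variant declarations, in which case equivalent hand-rolled interfaces are needed):

        using System;
        using System.Collections.Generic;

        class VarianceDemo
        {
            static void Main()
            {
                // Covariance: IEnumerable<out T> lets a sequence of strings be used
                // wherever a sequence of objects is expected.
                IEnumerable<string> strings = new List<string> { "a", "b" };
                IEnumerable<object> objects = strings;              // legal in C# 4.0
                foreach (object o in objects) Console.WriteLine(o);

                // Contravariance: IComparer<in T> lets a comparer of objects stand in
                // for a comparer of strings.
                IComparer<object> compareAnything = Comparer<object>.Default;
                IComparer<string> compareStrings = compareAnything; // legal in C# 4.0

                var list = new List<string> { "pear", "apple" };
                list.Sort(compareStrings);
                Console.WriteLine(string.Join(", ", list));         // apple, pear
            }
        }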
    A generic type can have both in and out modifiers on its type parameters, as is the case with the Func<…> delegate types:

        public delegate TResult Func<in TArg, out TResult>(TArg arg);

    Obviously the argument only ever comes in, and the result only ever comes out. Therefore a Func<object,string> can in fact be used as a Func<string,object>.

    Limitations

    Variant type parameters can only be declared on interfaces and delegate types, due to a restriction in the CLR. Variance only applies when there is a reference conversion between the type arguments. For instance, an IEnumerable<int> is not an IEnumerable<object> because the conversion from int to object is a boxing conversion, not a reference conversion. Also please note that the CTP does not contain the new versions of the .NET types mentioned above. In order to experiment with variance you have to declare your own variant interfaces and delegate types.

    COM Example

    Here is a larger Office automation example that shows many of the new C# features in action.

        using System;
        using System.Diagnostics;
        using System.Linq;
        using Excel = Microsoft.Office.Interop.Excel;
        using Word = Microsoft.Office.Interop.Word;

        class Program
        {
            static void Main(string[] args)
            {
                var excel = new Excel.Application();
                excel.Visible = true;

                excel.Workbooks.Add();                    // optional arguments omitted

                excel.Cells[1, 1].Value = "Process Name"; // no casts; Value dynamically
                excel.Cells[1, 2].Value = "Memory Usage"; // accessed

                var processes = Process.GetProcesses()
                    .OrderByDescending(p => p.WorkingSet)
                    .Take(10);

                int i = 2;
                foreach (var p in processes)
                {
                    excel.Cells[i, 1].Value = p.ProcessName; // no casts
                    excel.Cells[i, 2].Value = p.WorkingSet;  // no casts
                    i++;
                }

                Excel.Range range = excel.Cells[1, 1];       // no casts

                Excel.Chart chart = excel.ActiveWorkbook.Charts.
                    Add(After: excel.ActiveSheet);           // named and optional arguments

                chart.ChartWizard(
                    Source: range.CurrentRegion,
                    Title: "Memory Usage in " + Environment.MachineName); // named+optional

                chart.ChartStyle = 45;
                chart.CopyPicture(Excel.XlPictureAppearance.xlScreen,
                    Excel.XlCopyPictureFormat.xlBitmap,
                    Excel.XlPictureAppearance.xlScreen);

                var word = new Word.Application();
                word.Visible = true;

                word.Documents.Add();                        // optional arguments

                word.Selection.Paste();
            }
        }

    The code is much more terse and readable than the C# 3.0 counterpart. Note especially how the Value property is accessed dynamically. This is actually an indexed property, i.e. a property that takes an argument; something which C# does not understand. However the argument is optional. Since the access is dynamic, it goes through the runtime COM binder which knows to substitute the default value and call the indexed property. Thus, dynamic COM allows you to avoid accesses to the puzzling Value2 property of Excel ranges.

    Relationship with Visual Basic

    A number of the features introduced to C# 4.0 already exist or will be introduced in some form or other in Visual Basic:

    · Late binding in VB is similar in many ways to dynamic lookup in C#, and can be expected to make more use of the DLR in the future, leading to further parity with C#.
    · Named and optional arguments have been part of Visual Basic for a long time, and the C# version of the feature is explicitly engineered with maximal VB interoperability in mind.
    · No-PIA and variance are both being introduced to VB and C# at the same time.

    VB in turn is adding a number of features that have hitherto been a mainstay of C#.
    As a result future versions of C# and VB will have much better feature parity, for the benefit of everyone.

    Resources

    All available resources concerning C# 4.0 can be accessed through the C# Dev Center. Specifically, this white paper and other resources can be found at the Code Gallery site. Enjoy!

    Read the article

  • MSBuild cannot find SGen when compiling a solution

    - by Jaxidian
    I've looked at several other SGen-related questions on here and either their answers don't apply or their answers don't fix this for me. I have installed several SDKs to fix this issue with no luck. Reference types should not be changed since this is the only place this is a problem. One suggestion is to put SGen.exe into the C:\Windows\Microsoft.NET\Framework\v3.5 folder, but that's not been done on the box where this is not a problem. In this scenario, SGen.exe actually exists and is right where it's supposed to be, but MSBuild is still having trouble finding it for some reason!

    Background: We have a NAnt script that automates our builds. In this scenario, NAnt is calling MSBuild and MSBuild is generating the error claiming to be unable to find SGen. The project is .NET 3.5-based. I have my primary dev environment (64-bit Vista Ultimate) where the script works perfectly and I am attempting to duplicate it in a VM (64-bit Win 7 Ultimate). I THINK I have everything to the point where I should be good-to-go but this fails on the Win7 box (works perfectly on the Vista box). I have done some comparisons between the two boxes and they both look identical in this regard, but it still fails. For example, the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework key's sdkInstallRootv2.0 value is set to C:\Program Files\Microsoft.NET\SDK\v2.0 64bit\ on both machines. On both machines, SGen.exe is in that path's bin subdirectory.

    NAnt script:

        <target name="report-installer" depends="fail-if-environment-not-set">
            <exec program="MSBuild.exe" basedir="${framework35.directory}">
                <arg value="${tools.directory.current}\ReportInstaller\ReportInstaller.sln" />
                <arg value="/p:Configuration=${buildconfiguration.current}" />
            </exec>
        </target>

    The error message I get is this:

        report-installer:
            [exec] Microsoft (R) Build Engine Version 3.5.30729.4926
            [exec] [Microsoft .NET Framework, Version 2.0.50727.4927]
            [exec] Copyright (C) Microsoft Corporation 2007. All rights reserved.
            [exec]
            [exec] Build started 4/8/2010 11:28:23 AM.
            [exec] Project "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.sln" on node 0 (default targets).
            [exec] Building solution configuration "Release|Any CPU".
            [exec] Project "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.sln" (1) is building "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.csproj" (2) on node 0 (default targets).
            [exec] Could not locate the .NET Framework SDK. The task is looking for the path to the .NET Framework SDK at the location specified in the SDKInstallRootv2.0 value of the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework. You may be able to solve the problem by doing one of the following: 1.) Install the .NET Framework SDK. 2.) Manually set the above registry key to the correct location.
            [exec] CoreCompile:
            [exec] Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
            [exec] C:\Windows\Microsoft.NET\Framework\v2.0.50727\Microsoft.Common.targets(1902,9): error MSB3091: Task failed because "sgen.exe" was not found, or the .NET Framework SDK v2.0 is not installed. The task is looking for "sgen.exe" in the "bin" subdirectory beneath the location specified in the SDKInstallRootv2.0 value of the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework. You may be able to solve the problem by doing one of the following: 1.) Install the .NET Framework SDK v2.0. 2.) Manually set the above registry key to the correct location. 3.) Pass the correct location into the "ToolPath" parameter of the task.
            [exec] Done Building Project "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.csproj" (default targets) -- FAILED.
            [exec] Done Building Project "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.sln" (default targets) -- FAILED.
            [exec]
            [exec] Build FAILED.
            [exec]
            [exec] "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.sln" (default target) (1) ->
            [exec] "C:\Projects\Production\Tools\ReportInstaller\ReportInstaller.csproj" (default target) (2) ->
            [exec] (GenerateSerializationAssemblies target) ->
            [exec] C:\Windows\Microsoft.NET\Framework\v2.0.50727\Microsoft.Common.targets(1902,9): error MSB3091: Task failed because "sgen.exe" was not found, or the .NET Framework SDK v2.0 is not installed. The task is looking for "sgen.exe" in the "bin" subdirectory beneath the location specified in the SDKInstallRootv2.0 value of the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework. You may be able to solve the problem by doing one of the following: 1.) Install the .NET Framework SDK v2.0. 2.) Manually set the above registry key to the correct location. 3.) Pass the correct location into the "ToolPath" parameter of the task.
            [exec]
            [exec] 0 Warning(s)
            [exec] 1 Error(s)
            [exec]
            [exec] Time Elapsed 00:00:00.24
            [call] C:\Projects\Production\Source\reports.build(15,4):
            [call] External Program Failed: C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe (return code was 1)

    What am I doing wrong here that is causing MSBuild to STILL be unable to find SGen?
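    One thing that may be worth trying – an assumption based on how Microsoft.Common.targets wires the SGen task up to the SGenToolPath property, not something verified against this particular setup – is to hand MSBuild the SDK's bin folder explicitly, or to switch serialization assembly generation off if it isn't actually needed, by adding arguments to the existing <exec> task:

        <!-- Hypothetical additions; the path below is taken from the registry value quoted above. -->
        <arg value="/p:SGenToolPath=C:\Program Files\Microsoft.NET\SDK\v2.0 64bit\bin" />
        <!-- or, if serialization assemblies are not needed for this project: -->
        <arg value="/p:GenerateSerializationAssemblies=Off" />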

    Read the article

  • .NET Framework SDK version (csc.exe) when using vcbuild.exe on the command line

    - by r9r9r9
    I created a C# class library project named testVcBuild, then used vcbuild.exe to build the project on the command line like this:

        C:\Program Files\Microsoft Visual Studio 9.0\VC\vcpackages>vcbuild testVcBuild.csproj "Debug|Win32"

    The output shows:

        Microsoft (R) Visual C++ Project Builder - Command Line Version 9.00.21022
        Copyright (C) Microsoft Corporation. All rights reserved.
        Microsoft (R) Build Engine Version 2.0.50727.4927
        [Microsoft .NET Framework, Version 2.0.50727.4927]
        Copyright (C) Microsoft Corporation 2005. All rights reserved.

    I found that vcbuild.exe always calls "C:\Windows\Microsoft.NET\Framework\v2.0.50727\Csc.exe /noconfig ..". The problem is: how can I change the framework version to v3.5? My project works fine with v3.5 but is broken on v2.0.50727. If I use msbuild.exe instead of vcbuild.exe everything goes well; I just don't understand how to make it work with vcbuild.exe. (Win7 with VS2005 and VS2008 installed.)
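    For what it's worth, a workaround sketch rather than a vcbuild answer: the output above shows vcbuild hosting the 2.0 build engine for the .csproj, so the usual route to the v3.5 toolset is to invoke the 3.5 MSBuild directly and pin the tools version and target framework (the path assumes a default Framework install):

        "%WINDIR%\Microsoft.NET\Framework\v3.5\MSBuild.exe" testVcBuild.csproj ^
            /toolsversion:3.5 /p:Configuration=Debug /p:TargetFrameworkVersion=v3.5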

    Read the article

  • Installing Loadrunner

    - by grouchomarx
    Well, it figures that LoadRunner is not easy to install. Apparently, we need:

    - .NET Framework 3.5
    - Microsoft Data Access Components (MDAC) 2.8 SP1 (or later)
    - Microsoft Windows Installer 3.1
    - Microsoft Core XML Services (MSXML) 6.0
    - Microsoft Visual C++ 2005 Redistributable Package (x86)
    - Microsoft Visual C++ 2008 Redistributable Package (x86)
    - Web Services Enhancements (WSE) 2.0 SP3 for Microsoft .NET Redistributable Runtime MSI
    - Web Services Enhancements (WSE) 3.0 for Microsoft .NET Redistributable Runtime MSI

    Once that is done, I'm still trying to figure out how to execute this thing. Has anyone gone through the same experience with it?

    Read the article

  • Compile error C++: could not deduce template argument for 'T'

    - by OneShot
    I'm trying to read binary data to load structs back into memory so I can edit them and save them back to the .dat file. readVector() attempts to read the file and return the vectors that were serialized. But I'm getting this compile error when I try to build it. What am I doing wrong with my templates?

    ***** EDIT ***** Code:

        // Project 5.cpp : main project file.
        #include "stdafx.h"
        #include <iostream>
        #include <fstream>
        #include <string>
        #include <vector>
        #include <algorithm>

        using namespace System;
        using namespace std;

        #pragma hdrstop

        int checkCommand (string line);
        template<typename T> void writeVector(ofstream &out, const vector<T> &vec);
        template<typename T> vector<T> readVector(ifstream &in);

        struct InventoryItem {
            string Item;
            string Description;
            int Quantity;
            int wholesaleCost;
            int retailCost;
            int dateAdded;
        } ;

        int main(void)
        {
            cout << "Welcome to the Inventory Manager extreme! [Version 1.0]" << endl;

            ifstream in("data.dat");
            vector<InventoryItem> structList;
            readVector<InventoryItem>( in );

            while (1)
            {
                string line = "";
                cout << endl;
                cout << "Commands: " << endl;
                cout << "1: Add a new record " << endl;
                cout << "2: Display a record " << endl;
                cout << "3: Edit a current record " << endl;
                cout << "4: Exit the program " << endl;
                cout << endl;
                cout << "Enter a command 1-4: ";

                getline(cin , line);

                int rValue = checkCommand(line);
                if (rValue == 1)
                {
                    cout << "You've entered a invalid command! Try Again." << endl;
                } else if (rValue == 2) {
                    cout << "Error calling command!" << endl;
                } else if (!rValue) {
                    break;
                }
            }

            system("pause");
            return 0;
        }

        int checkCommand (string line)
        {
            int intReturn = atoi(line.c_str());
            int status = 3;

            switch (intReturn)
            {
                case 1: break;
                case 2: break;
                case 3: break;
                case 4: status = 0; break;
                default: status = 1; break;
            }

            return status;
        }

        template<typename T> void writeVector(ofstream &out, const vector<T> &vec)
        {
            out << vec.size();
            for(vector<T>::const_iterator i = vec.begin(); i != vec.end(); i++)
            {
                out << *i;
            }
        }

        ostream& operator<<(std::ostream &strm, const InventoryItem &i)
        {
            return strm << i.Item << " (" << i.Description << ")";
        }

        template<typename T> vector<T> readVector(ifstream &in)
        {
            size_t size;
            in >> size;

            vector<T> vec;
            vec.reserve(size);

            for(int i = 0; i < size; i++)
            {
                T tmp;
                in >> tmp;
                vec.push_back(tmp);
            }

            return vec;
        }

    Compiler errors:

    1>------ Build started: Project: Project 5, Configuration: Debug Win32 ------ 1>Compiling...
1>Project 5.cpp 1>.\Project 5.cpp(124) : warning C4018: '<' : signed/unsigned mismatch 1> .\Project 5.cpp(40) : see reference to function template instantiation 'std::vector<_Ty> readVector<InventoryItem>(std::ifstream &)' being compiled 1> with 1> [ 1> _Ty=InventoryItem 1> ] 1>.\Project 5.cpp(127) : error C2679: binary '>>' : no operator found which takes a right-hand operand of type 'InventoryItem' (or there is no acceptable conversion) 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(1144): could be 'std::basic_istream<_Elem,_Traits> &std::operator >><std::char_traits<char>>(std::basic_istream<_Elem,_Traits> &,signed char *)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(1146): or 'std::basic_istream<_Elem,_Traits> &std::operator >><std::char_traits<char>>(std::basic_istream<_Elem,_Traits> &,signed char &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(1148): or 'std::basic_istream<_Elem,_Traits> &std::operator >><std::char_traits<char>>(std::basic_istream<_Elem,_Traits> &,unsigned char *)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(1150): or 'std::basic_istream<_Elem,_Traits> &std::operator >><std::char_traits<char>>(std::basic_istream<_Elem,_Traits> &,unsigned char &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(155): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(std::basic_istream<_Elem,_Traits> &(__cdecl *)(std::basic_istream<_Elem,_Traits> &))' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(161): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(std::basic_ios<_Elem,_Traits> &(__cdecl *)(std::basic_ios<_Elem,_Traits> &))' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(168): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(std::ios_base &(__cdecl *)(std::ios_base &))' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(175): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(std::_Bool &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(194): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(short &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(228): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(unsigned short &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(247): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(int &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(273): or 
'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(unsigned int &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(291): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(long &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(309): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(__w64 unsigned long &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(329): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(__int64 &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(348): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(unsigned __int64 &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(367): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(float &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(386): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(double &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(404): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(long double &)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(422): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(void *&)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\include\istream(441): or 'std::basic_istream<_Elem,_Traits> &std::basic_istream<_Elem,_Traits>::operator >>(std::basic_streambuf<_Elem,_Traits> *)' 1> with 1> [ 1> _Elem=char, 1> _Traits=std::char_traits<char> 1> ] 1> while trying to match the argument list '(std::ifstream, InventoryItem)' 1>Build log was saved at "file://c:\Users\Owner\Documents\Visual Studio 2008\Projects\Project 5\Project 5\Debug\BuildLog.htm" 1>Project 5 - 1 error(s), 1 warning(s) ========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ========== Oh my god...I fixed that error I think and now I got another one. Will you PLEASE just help me on this one too! What the heck does this mean ??

    Read the article

  • Problem with connection to MS SQL Server database using SSMS

    - by Charles
    I have a database online with GoDaddy (who uses SQL Server 2005). They provide basic management tools, but tell you that for more advanced tools you can connect directly using SSMS. I followed their instructions to ensure my online database will accept remote connections, and can apparently log in using SSMS with success (after giving my hostname and access data). However, when attempting to expand the "Databases" folder tree, I get the following error:

        Failed to retrieve data for this request. (Microsoft.SqlServer.Management.Sdk.Sfc)
        For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476

        ADDITIONAL INFORMATION:
        An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
        The server principal "cmitchell" is not able to access the database "3pointdb" under the current security context. (Microsoft SQL Server, Error: 916)
        For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.4262&EvtSrc=MSSQLServer&EvtID=916&LinkId=20476

    Read the article

  • How to detect the language of MS Excel from C#

    - by Babba
    If I try to use Excel from C# (interop) I receive an error (HRESULT: 0x80028018) when the current thread language is different from the Excel language, so I need to set the thread language to match. What is the best way to determine the language of Excel/Office?

    1) The registry (HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Common\LanguageResources\UILanguage)? But how do I know which version to look under (12.0/14.0/...)?

    2) Via Application, as suggested here (http://stackoverflow.com/questions/2804556/how-to-detect-the-language-of-ms-excel)? That works, but I need a strong reference to a specific version of Microsoft.Office.Core, so I can't do it for different versions of Office:

        Excel.Application application = new Excel.Application();
        int iUi = application.LanguageSettings.get_LanguageID(Microsoft.Office.Core.MsoAppLanguageID.msoLanguageIDUI);
        MessageBox.Show(iUi.ToString());

    3) Something else?
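    A version-independent option is late binding, which needs no reference to Microsoft.Office.Core (or a specific Excel PIA) at compile time. A rough sketch – the literal 2 stands for msoLanguageIDUI, and whether this behaves identically on every Office version is an assumption worth testing:

        using System;
        using System.Reflection;

        static class ExcelLanguage
        {
            // Returns the LCID of the Excel UI language using only late binding.
            public static int GetUiLanguageId()
            {
                Type excelType = Type.GetTypeFromProgID("Excel.Application");
                object excel = Activator.CreateInstance(excelType);
                try
                {
                    object settings = excelType.InvokeMember(
                        "LanguageSettings", BindingFlags.GetProperty, null, excel, null);

                    // LanguageID is an indexed property; 2 == msoLanguageIDUI.
                    object lcid = settings.GetType().InvokeMember(
                        "LanguageID", BindingFlags.GetProperty, null, settings, new object[] { 2 });

                    return Convert.ToInt32(lcid);
                }
                finally
                {
                    excelType.InvokeMember("Quit", BindingFlags.InvokeMethod, null, excel, null);
                }
            }
        }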

    Read the article

  • C# Excel Exception

    - by Andrew James Watt
    I am trying to copy all data from worksheet1 and paste the values into worksheet2 at the same position. I am using Office 2003 and the Interop library. Here is my code:

        public void CreateExcelWorksheet()
        {
            Microsoft.Office.Interop.Excel.Application xlApp = new Microsoft.Office.Interop.Excel.Application();
            if (xlApp == null)
            {
                Console.WriteLine("EXCEL could not be started. Check that your office installation.");
                return;
            }
            xlApp.Visible = true;

            Workbook wb = xlApp.Workbooks.OpenXML(@"C:\XML_Export.xml", Type.Missing, 2);
            Worksheet worksheet1 = wb.Worksheets[1] as Worksheet;
            Worksheet worksheet2 = wb.Worksheets[2] as Worksheet;

            worksheet1.UsedRange.Copy(Type.Missing);
            worksheet2.PasteSpecial(Microsoft.Office.Interop.Excel.XlPasteType.xlPasteValues, false, false, Type.Missing, Type.Missing, Type.Missing, Type.Missing);
        }

    For some reason the paste command throws the following exception:

        Exception from HRESULT: 0x800A03EC

    Could anyone help?
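    A guess at the cause rather than a confirmed answer: Worksheet.PasteSpecial has a different parameter list (Format, Link, DisplayAsIcon, …) from Range.PasteSpecial, which is the member that takes an XlPasteType. Pasting values onto a destination Range is the more usual pattern, roughly like this fragment (it assumes xlApp, worksheet1 and worksheet2 are set up as in the code above):

        // Sketch only: paste the copied values anchored at A1 of the second sheet.
        worksheet1.UsedRange.Copy(Type.Missing);

        Range destination = (Range)worksheet2.Cells[1, 1];
        destination.PasteSpecial(
            Microsoft.Office.Interop.Excel.XlPasteType.xlPasteValues,
            Microsoft.Office.Interop.Excel.XlPasteSpecialOperation.xlPasteSpecialOperationNone,
            Type.Missing,
            Type.Missing);

        // Clear the copy marquee afterwards.
        xlApp.CutCopyMode = 0;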

    Read the article

  • Is Microsoft's Ribbon UI really that great, from a usability perspective?

    - by Thomas Owens
    The first time I ever used it was at my current job. Among my coworkers, the feelings toward its usability are mixed. The other developer doesn't really care one way or the other, as long as Office does everything he needs it to do when writing reports. The top manager likes it because it feels natural, and I feel the same way. But another coworker finds it clunky and hard to use (although she admits that she only uses it at home as her machine hasn't been upgraded yet, and that might change if she uses it more often at work). So – is the Ribbon UI really that innovative? What qualities about it make it a good or bad user interface mechanism? Possibly related: Adoption of the Ribbon UI

    Read the article

  • Windows 8 Launch – Why OEM and Retailers Should STFU

    - by D'Arcy Lussier
    Microsoft has gotten a lot of flak for the Surface from OEM/hardware partners who create Windows-based devices and, I'm sure, to an extent from retailers who normally stock and sell Windows-based devices. I mean, we all know how this is supposed to work – Microsoft makes the OS, partners make the hardware, retailers sell the hardware. Now Microsoft is breaking the rules by not only offering its own hardware but selling it online and through its Microsoft-branded stores! The thought has been that Microsoft is trying to set a standard for the other hardware companies to reach for. Maybe. I hope, at some level, Microsoft may be covertly responding to frustrations associated with trusting the OEMs and retailers to deliver on their part of the supply chain. I know as a consumer, I'm very frustrated with the Windows 8 launch. Aside from the Surface sales, there's nothing happening at the retail level. Let me back up and explain.

    Over the weekend I visited a number of stores in hopes of trying out various Windows 8 devices. Out of three retailers (Staples, Best Buy, and Future Shop), not *one* met my expectations.

    Let me be honest with you, Staples: I never really have high expectations of your computer department. If I need paper or pens, whatever, but computers – you're not the top of my list for price or selection. Still, considering you flaunted Win 8 devices in your flyer, I expected *something* – some sign of effort that you took the Windows 8 launch seriously. As I entered the 1910 Pembina Highway location in Winnipeg, there was nothing – no signage, no banners – nothing that would suggest Windows 8 had even launched. I made my way to the laptops. I had to play with each machine to determine which ones were running Windows 8. There wasn't anything on the placards that made it obvious which were Windows 8 machines and which ones were Windows 7. Likewise, there was no easy way to identify the touch-screen laptop (the HP model) from the others without physically touching the screen to verify. Horrible experience.

    In the same mall as the Staples I mentioned above, there's a Future Shop. Surely they would be more on the ball. I walked into the 1910 Pembina Highway location and immediately realized I would not get a better experience. Except for the sign by the front door mentioning Windows 8, there was *nothing* in the computer department pointing you to the Windows 8 devices. Like in Staples, the Win 8 laptops were mixed in with the Win 7 ones and there was nothing notable calling out which ones were running Win 8. I happened to hit up the St. James Street location today, thinking that since it's a busier store they must have more options. To their credit, they did have two staff members decked out in Windows 8 shirts who were helping a customer understand Windows 8. But otherwise, there was nothing highlighting the Windows 8 devices and they were again mixed in with the rest of the Win 7 machines.

    Finally, we have the St. James Street Best Buy location here in Winnipeg. I'm sure Best Buy will have its act together. Nope, not even close. Same story as the others: minimal signage (there was a sign as you walked in with a link to this schedule of demo days), Windows 8 hardware mixed with the rest of the PC offerings, and no visible call-outs identifying which were Win 8 based. This meant that, like at Future Shop and Staples, if you wanted to know which machine had Windows 8 you had to go and scrutinize each machine. Also, there was nothing identifying which ones were touch-based and which were not.

    Just Another Day…

    To these retailers, it seemed that the Windows 8 launch was just another day, with another product to add to the showroom floor. Meanwhile, Apple has its dedicated areas *in all three stores*. It was dead simple to find where the Apple products were compared to the Windows 8 products. No wonder Microsoft is starting to push its own retail stores. No wonder Microsoft is trying to funnel orders through them instead of relying on these bloated retail big-box stores who obviously can't manage a product launch.

    It's Not Just The Retailers…

    Remember when the Acer CEO, Founder, and President of Computer Global Operations all weighed in on how Microsoft releasing the Surface would have a "huge negative impact for the ecosystem and other brands may take a negative reaction"? Also remember the CEO stating "[making hardware] is not something you are good at so please think twice"? Well, launch day has come and gone, and so far Microsoft is the only one that delivered on having hardware available on the October 26th date. Oh sure, there are laptops running Windows 8 – but all-in-one desktop PCs? I've only seen one or two! And tablets are *non-existent*, with some showing an early-to-late November availability on Best Buy's website! So while the retailers could be doing more to make it easier to find Windows 8 devices, the manufacturers could help by *getting devices into stores*! That's supposedly something these companies are good at, according to the Acer CEO.

    So Here's What the Retailers and Manufacturers Need To Do…

    Get Product Out. The pivotal timeframe will be now to the end of November. We need to start seeing all these fantastic pieces of hardware ship – including the Samsung ATIV Smart PC Pro, the Acer Iconia, the Asus TAICHI 21, and the sexy Samsung Series 7 27" desktop. It's not enough to see product announcements; we need to see actual devices.

    Make It Easy For Customers To Find Win8 Devices. You want to make it easy to sell these things? Make it easy for people to find them! Have staff on hand who really know how these devices run and what can be done with them. Don't just have a single demo day – have people who can demo it every day!

    Make It Easy to See the Features. There are touch-screen desktops, touch-screen laptops, tablets, non-touch laptops, etc. People need to easily find the features of each machine. If I'm looking for a touch laptop, I shouldn't need to sift through all the non-touch laptops to find one – at the least, I need to quickly be able to see which ones are touch. I feel silly even typing this because this should be retail 101, and I have no retail background (but I do have an extensive background as a customer).

    In Summary…

    Microsoft launching the Surface and selling it through its own channels isn't slapping its OEM and retail partners in the face; it's slapping them to wake the hell up and stop coasting through Windows launch events like they don't matter. Unless I see some improvements from vendors and retailers in November, I may just hold onto my money for a Surface Pro, even if I have to wait until early 2013. Your move, OEMs/retailers.

    *Update – While my experience has been in Winnipeg, similar experiences have been voiced by colleagues in Calgary and Edmonton.

    Read the article

  • Thoughts on Build 2013

    - by D'Arcy Lussier
    Originally posted on: http://geekswithblogs.net/dlussier/archive/2013/06/30/153294.aspx

    And so another Build conference has come to an end. Below are my thoughts/perspectives on various aspects of the event. I'll do a separate blog post on my thoughts on the Build message for developers.

    The Good

    Moscone Center was a great venue for Build! Easy to get around, easy to get to, and well maintained, it was a very comfortable conference venue.

    Yeah, the free swag was nice. Build has built up an expectation that attendees will always get something; it'll be interesting to see how Microsoft maintains this expectation over the next few Build events. I still maintain that free swag should never be the main reason one attends an event, and for me this was definitely just an added bonus. I'm planning on trying to use the Surface as a dedicated 2nd device at work for meetings; I'll share my experiences over the next few months.

    The hackathon event was a great idea, although personally I couldn't justify spending the money on a conference registration just to spend the entire conference coding. Still, the apps that were created were really great and there was a lot of passion and excitement around the hackathon. I wonder if they couldn't have had the hackathon on the Monday/Tuesday for those that wanted to participate, so they didn't miss any of the actual conference over Wed/Thurs.

    San Francisco was a great city to host Build. Getting from hotels to the conference center was very easy (well, especially for me – I was only 3 blocks away) and the city itself felt very safe. However, if I never have to fly into SFO again I'll be alright with that! I had delays going into and out of SFO, and both apparently were due to the airport itself.

    The Bad

    Build is one of those oddities on the conference landscape where people will pay to commit to attending an event without knowing anything about the sessions. We got our list of conference sessions when we registered on Tuesday, not before. And even then, we only got titles and not descriptions (those were eventually made available via the conference's mobile application). I get it… they're going to make announcements and they don't want to give anything away through the session titles. But honestly, there wasn't anything in the session titles that I would have considered a surprise.

    Breakfasts were brutal. High-carb pastries, donuts, and muffins with fruit and hard-boiled eggs does not a conference breakfast make. I can't believe that the difference between a continental breakfast per person and a hot breakfast buffet would have had a huge impact on a conference fee that was already around $2000.

    The vendor area was anemic. I don't know why Microsoft forces the vendors into cookie-cutter booth areas (this year they were all made of plywood material). WPC, TechEd – booth areas there allow the vendors to be creative with their displays. Not so much for Build. Really odd was the lack of Microsoft's own representation around Bing. In the day 1 keynote Microsoft made a big deal about Bing as an API, yet there was nobody in the vendor area set up to provide more information or have discussions about the Bing API.

    The Ugly

    Our name badges were NFC-enabled. The purpose of this, beyond the vendors being able to scan your info, wasn't really made clear. An attendee I talked to showed how you could get a reader app on your phone so you can scan other members' cards and collect their contact info – which is a kewl idea; business cards are so 1990's. But I was *shocked* at the amount of information that was on our name badges!

    Here's what's displayed on our name badge:

    - Name
    - Company
    - Twitter Handle

    I'm OK with that. But here's what actually gets read:

    - Name
    - Company
    - Address Used for Registration
    - Phone Number Used for Registration

    So sharing that info with another attendee, they get way more of my info than just how to find me on Twitter! Microsoft, you need to fix this for the future. If vendors want to collect information on attendees, they should be able to collect an ID from the badge, then get a report with the corresponding records afterwards. My personal information should not be so readily available, and without my knowledge!

    Final Verdict

    Maybe it's my older age, maybe it's where I'm at in life with family, maybe it's where I'm at in my career, but when I consider whether a conference experience was valuable I get to the core reasons I attend: opportunities to learn, opportunities to network, opportunities to engage with Microsoft.

    Opportunities to Learn: Sessions I attended were generally OK, with some really stand-out ones on Day 2. I would love to see Microsoft adopt the Dojo format for a portion of their sessions. Hands-on labs are dull, and lecture-style sessions are great for information sharing, but a guided hands-on coding session (read: Dojo) provides the best of both worlds. Given that all content is publicly available online to everyone (Build attendee or not), the value of attending the conference sessions is decreased. The value, though, is in the discussions that take place in person afterwards, which leads to…

    Opportunities to Network: I enjoyed getting together with old friends and connecting with Twitter friends in person for the first time. I also had an opportunity to meet total strangers. So from a networking perspective, Build was fantastic! I still think it would have been great to have an area for ad-hoc discussions – where speakers could announce they'd be available for more questions after their sessions, or attendees who wanted to discuss a topic more in depth with other attendees could arrange space. Some people have no problem being outgoing and making these things happen, but others don't, and a structured model is more attractive to them.

    Opportunities to Engage with Microsoft: Hit and miss on this one. Outside of the vendor area, unless you cornered or reached out to a speaker, there wasn't any defined way to connect with blue badges. And as I mentioned above, Microsoft didn't have full representation in the vendor area (no Bing).

    All in all, Build was a fun party where I was informed about some new stuff and got some free swag. Was it worth the time away from home and the hit to my PD budget? I'd say somewhat. Build is a great informational conference, but I wouldn't call it a learning conference. Considering that TechEd seems to be moving to more of an IT Pro focus, independent developer conferences seem to be the best value for those looking to learn and not just be informed. With the rapid development cycle Microsoft is embracing, we're already seeing Build happen twice within a 12-month period. If that continues, the value of attending Build in person starts to diminish – especially with so much content available online. If Microsoft wants Build to be a must-attend event in the future, they need to start incorporating aspects of TechEd, past PDCs, and other conferences so those that want to leave with more than free swag have something to attract them.

    Read the article

  • Microsoft Silverlight cannot be used in browsers running in 64 bit mode.

    - by JD
    Hi, I have taken ownership of an application, a .NET WinForms application that hosts a System.Windows.Forms.WebBrowser. When the application launches it makes an HTTP request to load a XAP file. I immediately see the "install Silverlight" icon, and on clicking it I get: "Microsoft Silverlight cannot be used in browsers running in 64 bit mode". The app was initially written on a 32-bit machine, although I have a 64-bit machine. Any ideas what I need to tweak to get this running? JD
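
    A common cause of this message is that the hosting WinForms process runs as a 64-bit process (for example, compiled as AnyCPU and launched on 64-bit Windows), while the Silverlight plug-in of that era only supported 32-bit hosts. Below is a minimal diagnostic sketch in C#; the class name and message text are mine, not from the original application. If it reports 64-bit, rebuilding the host with Platform target = x86 is the usual remedy for this symptom.

        // Minimal diagnostic: reports whether the WinForms host (and therefore the embedded
        // WebBrowser control) runs as a 64-bit process. IntPtr.Size works on all .NET versions;
        // on .NET 4+ Environment.Is64BitProcess gives the same answer.
        using System;
        using System.Windows.Forms;

        static class SilverlightHostBitnessCheck
        {
            [STAThread]
            static void Main()
            {
                bool is64BitProcess = IntPtr.Size == 8;

                string message = is64BitProcess
                    ? "Host is 64-bit: the WebBrowser control loads the 64-bit IE engine, which " +
                      "the Silverlight plug-in does not support. Rebuild the host with Platform target = x86."
                    : "Host is 32-bit: Silverlight should be able to load inside the WebBrowser control.";

                MessageBox.Show(message, "Silverlight host bitness");
            }
        }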

    Read the article

  • How did Microsoft create assemblies that have circular references?

    - by Drew Noakes
    In the .NET BCL there are circular references between:
    - System.dll and System.Xml.dll
    - System.dll and System.Configuration.dll
    - System.Xml.dll and System.Configuration.dll
    Here's a screenshot from .NET Reflector that shows what I mean. How Microsoft created these assemblies is a mystery to me. Is a special compilation process required to allow this? I imagine something interesting is going on here.
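
    Such cycles are generally attributed to a staged (multi-pass) build rather than a single compiler invocation. The sketch below shows one way a circular assembly reference can be produced with the ordinary C# compiler; it is an illustration under that assumption, not a description of Microsoft's actual build system. The file names and the STAGE1 symbol are hypothetical.

        // Two source files, each compiled into its own assembly. A conditional compilation
        // symbol keeps the first pass free of the back-reference:
        //
        //   pass 1: csc /target:library /define:STAGE1 CoreLib.cs    -> stub CoreLib.dll
        //   pass 2: csc /target:library /r:CoreLib.dll UtilLib.cs    -> UtilLib.dll
        //   pass 3: csc /target:library /r:UtilLib.dll CoreLib.cs    -> final CoreLib.dll
        //
        // After pass 3, CoreLib.dll references UtilLib.dll and vice versa.

        // --- CoreLib.cs ---
        public static class Core
        {
            public const string Name = "Core";

        #if !STAGE1
            // Present only in the second compilation of CoreLib.cs; this is what
            // introduces the circular assembly reference.
            public static string Describe()
            {
                return Util.Name + " seen from " + Name;
            }
        #endif
        }

        // --- UtilLib.cs ---
        public static class Util
        {
            public const string Name = "Util";

            public static string Hello()
            {
                return "Hello from " + Core.Name;
            }
        }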

    Read the article

  • Microsoft Chart control converts \n in file names to newline characters.

    - by xpda
    I am using a Microsoft Chart control (System.Windows.Forms.DataVisualization.Charting.Chart) in a VB.NET 2008 Windows Forms application. I use folder paths for the x values in a pie chart. The Chart control converts a name like c:\newfolder into c:[newline]ewfolder. I tried doubling the backslash, making it c:\\newfolder, but this only changes the output to c:\[newline]ewfolder. Is there a workaround for this behavior?
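
    The control appears to translate the literal two-character sequence '\' + 'n' inside label text into a line break. One possible workaround, sketched below in C#, is to break up that sequence with a zero-width space so the visible label is unchanged; this is an assumption-based sketch rather than a documented fix, and the helper names are mine. The same string replacement can be done from VB.NET.

        // Hypothetical workaround (not a documented Chart API): insert a zero-width space
        // (U+200B) between '\' and 'n' so the control no longer sees the "\n" sequence.
        // Verify that the label font renders U+200B as invisible before relying on this.
        using System.Windows.Forms.DataVisualization.Charting;

        static class ChartLabelHelper
        {
            public static string EscapeBackslashN(string label)
            {
                return label.Replace("\\n", "\\\u200Bn");
            }

            public static void AddFolderSlice(Chart chart, string seriesName, string folderPath, double value)
            {
                // Use the escaped text only for display; keep the original path elsewhere if needed.
                int index = chart.Series[seriesName].Points.AddY(value);
                chart.Series[seriesName].Points[index].AxisLabel = EscapeBackslashN(folderPath);
            }
        }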

    Read the article

  • What is the new name of Microsoft.Data.Entity.Ctp?

    - by Anonymous Coward
    Hello everyone, I'm playing around with Entity Framework 4 and Code Only. The tutorial I'm following uses the beta version of Visual Studio 2010 and refers to Microsoft.Data.Entity.Ctp. Since I'm working with the final release of Visual Studio, the name of the DLL must have changed. Can somebody tell me what its name is now? Cheers, AC

    Read the article

  • How to input a live video stream to Microsoft Encoder.

    - by GautamB
    There is a facility in Microsoft Encoder to take input from a file or from a device such as a webcam for streaming, but I haven't found a way to feed a live video stream to the encoder, i.e. how to make the encoder listen on a particular UDP port, take the input stream from it, encode it, and push it to Windows Media Server. Any help will be appreciated. Thanks, Gautam B.
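
    As far as I know, the Expression Encoder SDK does not expose a "listen on a UDP port" source; its live pipeline is built around device or file sources that the job then pushes to a publishing point on the media server. Below is a sketch of that push model, assuming the Expression Encoder 4 SDK (Microsoft.Expression.Encoder.Live); the publishing-point URL is a placeholder, not something from the original question.

        using System;
        using System.Linq;
        using Microsoft.Expression.Encoder.Devices;
        using Microsoft.Expression.Encoder.Live;

        class LivePushSketch
        {
            static void Main()
            {
                // Pick the first video/audio capture devices found on the machine.
                EncoderDevice video = EncoderDevices.FindDevices(EncoderDeviceType.Video).FirstOrDefault();
                EncoderDevice audio = EncoderDevices.FindDevices(EncoderDeviceType.Audio).FirstOrDefault();

                LiveJob job = new LiveJob();
                LiveDeviceSource source = job.AddDeviceSource(video, audio);
                job.ActivateSource(source);

                // Push the encoded stream to a publishing point on the media server
                // (the URL below is a placeholder).
                job.PublishFormats.Add(new PushBroadcastPublishFormat
                {
                    PublishingPoint = new Uri("http://mediaserver/live/pubpoint.isml")
                });

                job.StartEncoding();
                Console.WriteLine("Encoding... press Enter to stop.");
                Console.ReadLine();
                job.StopEncoding();
            }
        }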

    Read the article

  • Getting a "'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine" error when hosting on Azure servers

    - by user1864888
    I am hosting a web application which uses OLEDB 12 to read an Excel file, but I am getting the error "'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine". I wonder if I should use a lower version of OLEDB, or whether Azure does not support OLEDB services at all. Before rewriting the program, I just wanted to see if anyone knows a better alternative. I am using Visual Studio 2010 for my development. Thanks in advance.
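
    The ACE OLE DB provider is a desktop Office component and is typically not present on Azure instances, and installing it there is awkward. One commonly used alternative for .xlsx files is a managed reader such as the EPPlus library, which needs no provider registration. The sketch below assumes EPPlus and an .xlsx file; the file path and loop are illustrative only.

        // Minimal sketch: reading an .xlsx file with EPPlus instead of the ACE OLE DB
        // provider, so no machine-wide provider registration is required.
        // Note: newer EPPlus releases (5+) additionally require setting ExcelPackage.LicenseContext.
        using System;
        using System.IO;
        using System.Linq;
        using OfficeOpenXml;

        class ExcelReadSketch
        {
            static void Main()
            {
                var file = new FileInfo(@"D:\data\upload.xlsx"); // placeholder path

                using (var package = new ExcelPackage(file))
                {
                    ExcelWorksheet sheet = package.Workbook.Worksheets.First();
                    int rows = sheet.Dimension.End.Row;
                    int cols = sheet.Dimension.End.Column;

                    for (int row = 1; row <= rows; row++)
                    {
                        var values = Enumerable.Range(1, cols)
                                               .Select(col => sheet.Cells[row, col].Text);
                        Console.WriteLine(string.Join("\t", values));
                    }
                }
            }
        }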

    Read the article

  • What can we do to make Microsoft add IntelliTrace to VS 2010 Professional Edition?

    - by Ikaso
    Now that Microsoft has released VS 2010 I went to the product page here. To my amazement I found out that IntelliTrace (the Historical Debugger) is supported only in the Ultimate Edition of VS 2010. This means that you have to spend almost $4000 for a renewal and almost $12000 for a new license. Does anyone have any idea how we can change this decision? Specifically, how to make them add this feature to VS 2010 Professional Edition.

    Read the article

  • Microsoft Windows 64-bit application development best practices: installation folder.

    - by abmv
    My problem is that a vendor is providing me with a 64-bit application (packed in a 64-bit installer), but it installs to the x86 (Program Files (x86)) folder, and he keeps telling me it's OK. I want it to install in the Program Files directory, since the 32-bit version does that and scripts for the app were developed based on this assumption. Can someone direct me to the Microsoft-recommended best practices for 64-bit applications (links)? Thanks in advance.
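
    One quick way to check the vendor's claim is to see what bitness the installed binary actually runs at: a genuine 64-bit process resolves ProgramFiles to C:\Program Files, while a 32-bit (or 32-bit-flagged) process gets C:\Program Files (x86). The small C# diagnostic below illustrates that resolution when built once as x86 and once as x64; it doesn't prove what the installer should do, only how Windows treats a process of each bitness. The vendor's own EXE can also be inspected with corflags (managed) or dumpbin /headers (native).

        // Small diagnostic: reports whether the current process is really 64-bit and which
        // Program Files folder Windows resolves for it. A "64-bit" installer or executable
        // that still resolves "Program Files (x86)" is most likely flagged as 32-bit
        // somewhere in the chain (installer package platform or the EXE's PE header).
        using System;

        class BitnessReport
        {
            static void Main()
            {
                Console.WriteLine("64-bit OS:       {0}", Environment.Is64BitOperatingSystem);
                Console.WriteLine("64-bit process:  {0}", Environment.Is64BitProcess);
                Console.WriteLine("ProgramFiles:    {0}", Environment.GetFolderPath(Environment.SpecialFolder.ProgramFiles));
                Console.WriteLine("ProgramW6432:    {0}", Environment.GetEnvironmentVariable("ProgramW6432"));
            }
        }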

    Read the article

  • JavaScript method to write to Microsoft Visual Web Developer Debugger?

    - by Josh
    I generally test my web apps with Firefox and use Firebug. I love Firebug. But when I'm testing JavaScript code in IE I use the debugger in Microsoft's Visual Web Developer 2008 Express Edition. I would love to have an equivalent to Firebug's console.log methods which would allow me to log messages to Visual Web Developer. Any way to log messages to the error list/messages list/output pane using JavaScript?

    Read the article

< Previous Page | 263 264 265 266 267 268 269 270 271 272 273 274  | Next Page >