Search Results

Search found 1941 results on 78 pages for 'infrastructure'.

Page 48/78 | < Previous Page | 44 45 46 47 48 49 50 51 52 53 54 55  | Next Page >

  • Windows Azure: Major Updates for Mobile Backend Development

    - by ScottGu
    This week we released some great updates to Windows Azure that make it significantly easier to develop mobile applications that use the cloud. These new capabilities include:

      • Mobile Services: Custom API support
      • Mobile Services: Git Source Control support
      • Mobile Services: Node.js NPM Module support
      • Mobile Services: A .NET API via NuGet
      • Mobile Services and Web Sites: Free 20MB SQL Database Option for Mobile Services and Web Sites
      • Mobile Notification Hubs: Android Broadcast Push Notification Support

    All of these improvements are now available to use immediately (note: some are still in preview). Below are more details about them.

    Mobile Services: Custom APIs, Git Source Control, and NuGet

    Windows Azure Mobile Services provides the ability to easily stand up a mobile backend that can be used to support your Windows 8, Windows Phone, iOS, Android and HTML5 client applications. Starting with the first preview we supported the ability to easily extend your data backend logic with server-side scripting that executes as part of client-side CRUD operations against your cloud back-end data tables.

    With today's update we are extending this support even further and introducing the ability for you to also create and expose Custom APIs from your Mobile Service backend, and easily publish them to your mobile clients without having to associate them with a data table. This capability enables a whole set of new scenarios, including the ability to work with data sources other than SQL Databases (for example: Table Services or MongoDB), broker calls to 3rd-party APIs, integrate with Windows Azure Queues or Service Bus, work with custom non-JSON payloads (e.g. Windows Periodic Notifications), route client requests to services back on-premises (e.g. with the new Windows Azure BizTalk Services), or simply implement functionality that doesn't correspond to a database operation. The Custom APIs can be written in server-side JavaScript (using Node.js) and can use Node's NPM packages. We will also add support for Custom APIs written using .NET in the future.

    Creating a Custom API

    Adding a Custom API to an existing Mobile Service is easy. Using the Windows Azure Management Portal you can now simply click the new "API" tab within your Mobile Service, and then click the "Create a Custom API" button to create a new Custom API within it. Give the API whatever name you want to expose, and then choose the security permissions you'd like to apply to the HTTP methods you expose within it. You can easily lock down the HTTP verbs of your Custom API to be available to anyone, only those who have a valid application key, only authenticated users, or administrators. Mobile Services will then enforce these permissions without you having to write any code.

    When you click the OK button you'll see the new API show up in the API list. Selecting it will enable you to edit the default script that contains some placeholder functionality. Today's release enables Custom APIs to be written using Node.js (we will support writing Custom APIs in .NET as well in a future release), and the Custom API programming model follows the Node.js convention for modules, which is to export functions to handle HTTP requests. The default script exposes functionality for an HTTP POST request; to support a GET, simply change the export statement accordingly.
    For example, you can write code that reads and returns data from Windows Azure Table Storage using the Azure Node API. After saving the changes, you can then call the API from any Mobile Service client application (including Windows 8, Windows Phone, iOS, Android or HTML5 with CORS). Below is the code for how you could invoke the API asynchronously from a Windows Store application using .NET and the new InvokeApiAsync method, and data-bind the results to a control within your XAML:

        private async void RefreshTodoItems()
        {
            var results = await App.MobileService.InvokeApiAsync<List<TodoItem>>("todos", HttpMethod.Get, parameters: null);
            ListItems.ItemsSource = new ObservableCollection<TodoItem>(results);
        }

    Integrating authentication and authorization with Custom APIs is really easy with Mobile Services. Just like with data requests, Custom API requests enjoy the same built-in authentication and authorization support of Mobile Services (including integration with Microsoft ID, Google, Facebook and Twitter authentication providers), and it also enables you to easily integrate your Custom API code with other Mobile Service capabilities like push notifications, logging, SQL, etc. Check out our new tutorials to learn more about how to use the new Custom API support, and start adding them to your app today.

    Mobile Services: Git Source Control Support

    Today's Mobile Services update also enables source control integration with Git. The new source control support provides a Git repository as part of your Mobile Service, and it includes all of your existing Mobile Service scripts and permissions. You can clone that Git repository on your local machine, make changes to any of your scripts, and then easily deploy the Mobile Service to production using Git. This enables a really great developer workflow that works on any developer machine (Windows, Mac and Linux).

    To use the new support, navigate to the dashboard for your Mobile Service and select the "Set up source control" link. If this is your first time enabling Git within Windows Azure, you will be prompted to enter the credentials you want to use to access the repository. Once you configure this, you can switch to the Configure tab of your Mobile Service and you will see a Git URL you can use to clone your repository locally from your favorite command line:

        > git clone https://scottgutodo.scm.azure-mobile.net/ScottGuToDo.git

    The repository contains a service folder with several subfolders. Custom API scripts and associated permissions appear under the api folder as .js and .json files respectively (the .json files persist a JSON representation of the security settings for your endpoints). Similarly, table scripts and table permissions appear as .js and .json files, but since table scripts are separate per CRUD operation, they follow the naming convention of <tablename>.<operationname>.js. Finally, scheduled job scripts appear in the scheduler folder, and the shared folder is provided as a convenient location for you to store code shared by multiple scripts and a few miscellaneous things such as the APNS feedback script.
    Let's modify the table script todos.js so that we have slightly better error handling when an exception occurs while querying our Table service:

        // todos.js
        tableService.queryEntities(query, function(error, todoItems) {
            if (error) {
                console.error("Error querying table: " + error);
                response.send(500);
            } else {
                response.send(200, todoItems);
            }
        });

    Save these changes, then back at the command prompt commit the changes and push them to the Mobile Service:

        > git add .
        > git commit -m "better error handling in todos.js"
        > git push

    Once deployment of the changes is complete, they take effect immediately, and you will also see the changes reflected in the portal. With the new source control feature, we're making it really easy for you to edit your Mobile Service locally and push changes in an atomic fashion without sacrificing the ease of use of the Windows Azure Portal.

    Mobile Services: NPM Module Support

    The new Mobile Services source control support also allows you to add any Node.js module you need in your scripts beyond the fixed set provided by Mobile Services. For example, you can easily switch to use MongoDB instead of Windows Azure Table Storage in our example above. Set up MongoDB by either purchasing a MongoLab subscription (which provides MongoDB as a service) via the Windows Azure Store, or setting it up yourself on a Virtual Machine (either Windows or Linux). Then go to the service folder of your local Git repository and run the following command:

        > npm install mongoose

    This will add the Mongoose module to your Mobile Service scripts. After that you can use and reference the Mongoose module in your Custom API scripts to access your Mongo database:

        var mongoose = require('mongoose');
        var schema = mongoose.Schema({ text: String, completed: Boolean });

        exports.get = function (request, response) {
            mongoose.connect('<your Mongo connection string>');
            var TodoItemModel = mongoose.model('todoitem', schema);
            TodoItemModel.find(function (err, items) {
                if (err) {
                    console.log('error: ' + err);
                    return response.send(500);
                }
                response.send(200, items);
            });
        };

    Don't forget to push your changes to your Mobile Service once you are done:

        > git add .
        > git commit -m "Switched to use Mongo Labs"
        > git push

    Now our Mobile Service app is using MongoDB! Note: with today's update, usage of custom Node.js modules is limited to Custom API scripts only. We will enable it in all scripts (including data and custom CRON tasks) shortly.

    New Mobile Services NuGet package, including .NET 4.5 support

    A few months ago we announced a new pre-release version of the Mobile Services client SDK based on portable class libraries (PCL). Today, we are excited to announce that this new library is now a stable .NET client SDK for Mobile Services and is no longer a pre-release package. Today's update includes full support for Windows Store, Windows Phone 7.x, and .NET 4.5, which allows developers to use Mobile Services from ASP.NET or WPF applications. You can install and use this package today via NuGet.

    Mobile Services and Web Sites: Free 20MB Database for Mobile Services and Web Sites

    Starting today, every customer of Windows Azure gets one free 20MB SQL Database to use for 12 months (for both dev/test and production) with Web Sites and Mobile Services.
    When creating a Mobile Service or a Web Site, simply choose the new "Create a new Free 20MB database" option to take advantage of it. You can use this free SQL Database together with the 10 free Web Sites and 10 free Mobile Services you get with your Windows Azure subscription, or from any other Windows Azure VM or Cloud Service.

    Notification Hubs: Android Broadcast Push Notification Support

    Earlier this year, we introduced a new capability in Windows Azure for sending broadcast push notifications at high scale: Notification Hubs. In the initial preview of Notification Hubs you could use this support with both iOS and Windows devices. Today we're excited to announce new Notification Hubs support for sending push notifications to Android devices as well.

    Push notifications are a vital component of mobile applications. They are critical not only in consumer apps, where they are used to increase app engagement and usage, but also in enterprise apps, where up-to-date information increases employee responsiveness to business events. You can use Notification Hubs to send push notifications to devices from any type of app (a Mobile Service, Web Site, Cloud Service or Virtual Machine). Notification Hubs provide you with the following capabilities:

      • Cross-platform Push Notification Support. Notification Hubs provide a common API to send push notifications to iOS, Android, or Windows Store devices at once. Your app can send notifications in platform-specific formats or in a platform-independent way.
      • Efficient Multicast. Notification Hubs are optimized to enable push notification broadcast to thousands or millions of devices with low latency. Your server back-end can fire one message into a Notification Hub, and millions of push notifications can automatically be delivered to your users. Devices and apps can specify a number of per-user tags when registering with a Notification Hub. These tags do not need to be pre-provisioned or disposed, and provide a very easy way to send filtered notifications to an infinite number of users/devices with a single API call.
      • Extreme Scale. Notification Hubs enable you to reach millions of devices without having to re-architect or shard your application. The pub/sub routing mechanism allows you to broadcast notifications in a super-efficient way. This makes it incredibly easy to route and deliver notification messages to millions of users without having to build your own routing infrastructure.
      • Usable from any Backend App. Notification Hubs can be easily integrated into any back-end server app, whether it is a Mobile Service, a Web Site, a Cloud Service or an IaaS VM.

    It is easy to configure Notification Hubs to send push notifications to Android.
    Create a new Notification Hub within the Windows Azure Management Portal (New -> App Services -> Service Bus -> Notification Hub). Then register for Google Cloud Messaging using https://code.google.com/apis/console and obtain your API key, and simply paste that key on the Configure tab of your Notification Hub management page under the Google Cloud Messaging settings.

    Then just add code to the OnCreate method of your Android app's MainActivity class to register the device with Notification Hubs:

        gcm = GoogleCloudMessaging.getInstance(this);
        String connectionString = "<your listen access connection string>";
        hub = new NotificationHub("<your notification hub name>", connectionString, this);
        String regid = gcm.register(SENDER_ID);
        hub.register(regid, "myTag");

    Now you can broadcast notifications from your .NET backend (or Node, Java, or PHP) to any Windows Store, Android, or iOS device registered with the "myTag" tag via a single API call (you can literally broadcast messages to millions of clients you have registered with just one API call):

        var hubClient = NotificationHubClient.CreateClientFromConnectionString(
                            "<your connection string with full access>",
                            "<your notification hub name>");
        hubClient.SendGcmNativeNotification("{ 'data' : { 'msg' : 'Hello from Windows Azure!' } }", "myTag");

    Notification Hubs provide an extremely scalable, cross-platform push notification infrastructure that enables you to efficiently route push notification messages to millions of mobile users and devices. It will make your push notification logic significantly simpler and more scalable, and allow you to build even better apps with it. Learn more about Notification Hubs on MSDN.

    Summary

    The above features are now live and available to start using immediately (note: some of the services are still in preview). If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today. Visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Multiple Silverlight Unit Test Projects in Solution

    - by IUnknown
    I am building out a number of Silverlight 4.0 libraries that are part of the same solution. I like to break them into separate projects and have a unit test project for each:

        SolutionX
        -LibraryProject1
        ---Class1.cs
        ---Class2.cs
        -LibraryProject1.Test
        ---Tests1.cs
        ---Tests2.cs
        -LibraryProject2
        ---Class1.cs
        ---Class2.cs
        ---Class3.cs
        -LibraryProject2.Test
        ---Tests1.cs
        ---Tests2.cs
        ---Tests3.cs
        -LibraryProject3
        ---Class1.cs
        -LibraryProject3.Test
        ---Tests1.cs

    This works great when using regular VS test projects and infrastructure, because I can create and execute lists of tests aggregated from each test project. But with the Silverlight Unit Test Framework, since the Silverlight unit test project must be the startup project, I cannot figure out how to run a collection of tests from each test project in one go. I have to run each separately and then switch the startup project each time. I would prefer to avoid creating complex build scripts or build definitions - is there a way to run all the tests at once? -Thanks
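    One workaround, rather than switching the startup project, is to keep a single Silverlight test host project that references every *.Test project and registers their assemblies with the test page. A minimal sketch, assuming the Microsoft.Silverlight.Testing framework's UnitTestSystem/UnitTestSettings types and hypothetical test class names from the projects above:

        // App.xaml.cs of a single "SolutionX.TestHost" Silverlight application (hypothetical project)
        using System.Windows;
        using Microsoft.Silverlight.Testing;

        public partial class App : Application
        {
            public App()
            {
                Startup += (s, e) =>
                {
                    // Start from the default settings, then add one assembly per test project.
                    UnitTestSettings settings = UnitTestSystem.CreateDefaultSettings();
                    settings.TestAssemblies.Add(typeof(LibraryProject1.Test.Tests1).Assembly);
                    settings.TestAssemblies.Add(typeof(LibraryProject2.Test.Tests1).Assembly);
                    settings.TestAssemblies.Add(typeof(LibraryProject3.Test.Tests1).Assembly);
                    RootVisual = UnitTestSystem.CreateTestPage(settings);
                };
                InitializeComponent();
            }
        }

    With this in place only the host project ever needs to be the startup project, and every registered assembly runs in one pass.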

    Read the article

  • XAML Parsing Exception

    - by e28Makaveli
    I have a simple XAML page that loads fine when it is loaded as part of any application within Visual Studio. However, when I deploy this application using ClickOnce, I get the following exception:

        Type : System.Windows.Markup.XamlParseException, PresentationFramework, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
        Message : Unable to cast object of type 'System.Windows.Controls.Grid' to type 'EMS.Controls.Dictionary.StatusBarControl'. Error at object 'System.Windows.Controls.Grid' in markup file 'EMS.Controls.Dictionary;component/views/statusbarcontrol.xaml'.
        Source : PresentationFramework
        Help link :
        LineNumber : 0
        LinePosition : 0
        KeyContext :
        UidContext :
        NameContext :
        BaseUri : pack://application:,,,/EMS.Controls.Dictionary;component/views/statusbarcontrol.xaml
        Data : System.Collections.ListDictionaryInternal
        TargetSite : Void ThrowException(System.String, System.Exception, Int32, Int32, System.Uri, System.Windows.Markup.XamlObjectIds, System.Windows.Markup.XamlObjectIds, System.Type)
        Stack Trace :
            at System.Windows.Markup.XamlParseException.ThrowException(String message, Exception innerException, Int32 lineNumber, Int32 linePosition, Uri baseUri, XamlObjectIds currentXamlObjectIds, XamlObjectIds contextXamlObjectIds, Type objectType)
            at System.Windows.Markup.XamlParseException.ThrowException(ParserContext parserContext, Int32 lineNumber, Int32 linePosition, String message, Exception innerException)
            at System.Windows.Markup.BamlRecordReader.ReadRecord(BamlRecord bamlRecord)
            at System.Windows.Markup.BamlRecordReader.Read(Boolean singleRecord)
            at System.Windows.Markup.TreeBuilderBamlTranslator.ParseFragment()
            at System.Windows.Markup.TreeBuilder.Parse()
            at System.Windows.Markup.XamlReader.LoadBaml(Stream stream, ParserContext parserContext, Object parent, Boolean closeStream)
            at System.Windows.Application.LoadComponent(Object component, Uri resourceLocator)
            at EMS.Controls.Dictionary.StatusBarControl.InitializeComponent()
            at EMS.Controls.Dictionary.StatusBarControl..ctor(IDataView content)
            at OCC600.ReportManager.ReportPresenter.ShowQueryView(Object arg, Boolean bringForward, Type selectedDataType)
            at OCC600.ReportManager.ReportPresenter..ctor(IUnityContainer container)
            at OCC600.ReportManager.Module.Initialize()
            at Microsoft.Practices.Composite.Modularity.ModuleLoader.Initialize(ModuleInfo[] moduleInfos)

        Inner Exception
        ---------------
        Type : System.InvalidCastException, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
        Message : Unable to cast object of type 'System.Windows.Controls.Grid' to type 'EMS.Controls.Dictionary.StatusBarControl'.
        Source : EMS.Controls.Dictionary
        Help link :
        Data : System.Collections.ListDictionaryInternal
        TargetSite : Void System.Windows.Markup.IComponentConnector.Connect(Int32, System.Object)
        Stack Trace :
            at EMS.Controls.Dictionary.StatusBarControl.System.Windows.Markup.IComponentConnector.Connect(Int32 connectionId, Object target)
            at System.Windows.Markup.BamlRecordReader.ReadConnectionId(BamlConnectionIdRecord bamlConnectionIdRecord)
            at System.Windows.Markup.BamlRecordReader.ReadRecord(BamlRecord bamlRecord)

    The XAML page is given below:

        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:cdic="clr-namespace:EMS.Controls.Dictionary.Primitives"
        xmlns:dicutil="clr-namespace:OCC600.Infrastructure.Dictionary.Utility;assembly=EMS.Infrastructure.Dictionary"
        Loaded="ResultSetControl_Loaded"

        <StatusBarItem Margin="10,0, 10, 0">
            <TextBlock Text="{Binding CountText}" Padding="5,0"/>
        </StatusBarItem>
        <StatusBarItem Margin="10,0">
            <TextBlock Text="{Binding MemoryUsageText}" Padding="5,0"/>
        </StatusBarItem>
        <StatusBarItem Margin="10,0" MaxWidth="400">
            <TextBlock Text="{Binding StatusReport.Summary}" Padding="5,0" />
        </StatusBarItem>
        <ProgressBar Margin="20,0" Name="progBar" Width="150" Height="13" Visibility="Collapsed">
            <ProgressBar.ContextMenu>
                <ContextMenu Name="ctxMenu" ItemsSource="{Binding ActiveWorkItems}"
                             Visibility="{Binding Path=ActiveWorkItems.HasItems, Converter={StaticResource BooToVisConv}}">
                    <ContextMenu.ItemContainerStyle>
                        <Style TargetType="{x:Type MenuItem}">
                            <Setter Property="Template">
                                <Setter.Value>
                                    <ControlTemplate TargetType="{x:Type MenuItem}">
                                        <StackPanel Height="20" Margin="10,0" Orientation="Horizontal" HorizontalAlignment="Left">
                                            <TextBlock Text="{Binding Path=Name, Mode=OneTime}" Foreground="Black" VerticalAlignment="Center" HorizontalAlignment="Left" />
                                            <ToggleButton Style="{StaticResource vistaGoldenToggleButtonStyle}" Padding="5,0" Content="Cancel" IsChecked="{Binding Cancel}" Margin="10,0,0,0" />
                                        </StackPanel>
                                    </ControlTemplate>
                                </Setter.Value>
                            </Setter>
                        </Style>
                    </ContextMenu.ItemContainerStyle>
                </ContextMenu>
            </ProgressBar.ContextMenu>
        </ProgressBar>
        <StatusBarItem Margin="10,0" MaxWidth="400" HorizontalAlignment="Right">
            <StackPanel Orientation="Horizontal">
                <TextBlock Text="Last Update:" Padding="5,0" />
                <TextBlock Text="{Binding TimeStamp}" Padding="5,0" />
            </StackPanel>
        </StatusBarItem>
        <!-- TODO: Put checkmark if all is well, or error if connection failed -->
        <StatusBarItem Style="{DynamicResource {ComponentResourceKey TypeInTargetAssembly=dc:Ribbon, ResourceId=StatusBarItemAlt}}" DockPanel.Dock="Right" Padding="6,0,32,0">
            <cdic:SplitButton Margin="5,0" Padding="5,2" Style="{DynamicResource {ComponentResourceKey TypeInTargetAssembly={x:Type cdic:SplitButtonResources}, ResourceId=vistaSplitButtonStyle}}" Mode="Split">
                <cdic:SplitButton.ContextMenu>
                    <ContextMenu>
                        <MenuItem Header="Refresh Now" Command="{Binding ToggleConnectivityCmd}" CommandParameter="false"/>
                        <MenuItem IsCheckable="True" IsChecked="{Binding ConnectState, Converter={StaticResource isFailedConverter}}" CommandParameter="{Binding RelativeSource={x:Static RelativeSource.Self}, Path=IsChecked}" Header="Work Offline" Command="{Binding ToggleConnectivityCmd}"/>
                    </ContextMenu>
                </cdic:SplitButton.ContextMenu>
                <cdic:SplitButton.Content>
                    <StackPanel Orientation="Horizontal">
                        <Image x:Name="img" Source="{Binding ConnectState, Converter={StaticResource imageConverter}}" Width="16" Height="16" HorizontalAlignment="Center" VerticalAlignment="Center"/>
                        <TextBlock Text="{Binding ConnectState}" Padding="3,0,0,0"/>
                    </StackPanel>
                </cdic:SplitButton.Content>
            </cdic:SplitButton>
        </StatusBarItem>
        </StatusBar>
        </Grid>

    The error just seems to have come out of nowhere. Any ideas? TIA.

    Read the article

  • DataContractSerializer KnownType attribute not being respected?

    - by e28Makaveli
    I have a class that is decorated with a KnownType attribute naming the type of the class itself. Is this not allowed?

        [KnownType(typeof(Occ600UIConfig))]
        public class Occ600UIConfig
        {
        }

    If so, why is the DataContractSerializer throwing the following exception?

        {"Error in line 1 position 387. Element 'http://schemas.microsoft.com/2003/10/Serialization/Arrays:Value' contains data of the 'http://schemas.datacontract.org/2004/07/OCC600.Infrastructure.Dictionary.BusinessEntities:Occ600UIConfig' data contract. The deserializer has no knowledge of any type that maps to this contract. Add the type corresponding to 'Occ600UIConfig' to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding it to the list of known types passed to DataContractSerializer."}
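    The KnownTypeAttribute on the class only helps when the serializer's root type (or a type it already knows) points at that class; the Serialization/Arrays:Value element in the error suggests the instance is travelling as an object value inside something like a serialized dictionary. One option in that situation is to hand the concrete type to the serializer directly. A minimal sketch, assuming a hypothetical Dictionary<string, object> payload (Occ600UIConfig is the class declared above):

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Runtime.Serialization;

        public static class UiConfigSerialization
        {
            public static byte[] Serialize(Dictionary<string, object> settings)
            {
                // Supplying the known types to the serializer itself gives the
                // deserializer the knowledge the error message says it is missing.
                var serializer = new DataContractSerializer(
                    typeof(Dictionary<string, object>),
                    new List<Type> { typeof(Occ600UIConfig) });

                using (var stream = new MemoryStream())
                {
                    serializer.WriteObject(stream, settings);
                    return stream.ToArray();
                }
            }
        }

    The same known-types list must be passed on the deserializing side; alternatively, the KnownType attribute can be placed on the type that declares the object-valued member rather than on Occ600UIConfig itself.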

    Read the article

  • Creating a Wiki site in SharePoint 2010

    - by Ben
    I have a SharePoint 2010 site set up and would like to add a wiki site to it. Normally, to add a subsite to a parent site you would click "Site Actions" -> "New Site", choose the site type (e.g. Team Site), and it would work fine. In the case of adding a wiki site, I choose "Enterprise Wiki" and get "Error: An unexpected error has occurred. Correlation Id: ...". So I took the Correlation Id and checked the ULS files to see what the error is, and the first bad sign seems to be:

        SharePoint Foundation Feature Infrastructure: "The element of type 'ContentTypeBinding' for feature 'EnterpriseWiki' (id: 76d688ad-c16e-4cec-9b71-7b7f0d79b9cd) threw an exception during activation: Specified argument was out of the range of valid values. Parameter name: Content type not found (Id: '0x010100C568DB52D9D0A14D9B2FDCC96666E9F2007948130EC3DB064584E219954237AF39004C1F8B46085B4D22B1CDC3DE08CFFB9C')."

    There are more errors in the log, but this seems to be the first one. Any ideas on what's going on? Thanks

    Read the article

  • Upgraded to EF6 blew up Universal provider session state for Azure

    - by Ryan
    I have an ASP.NET MVC 4 application that uses the Universal Providers for session state:

        <sessionState mode="Custom" sqlConnectionString="DefaultConnection" customProvider="DefaultSessionProvider">
          <providers>
            <add name="DefaultSessionProvider"
                 type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
                 connectionStringName="DefaultConnection" />
          </providers>
        </sessionState>

    When I upgraded to Entity Framework 6 I started getting this error:

        Method not found: 'System.Data.Objects.ObjectContext System.Data.Entity.Infrastructure.IObjectContextAdapter.get_ObjectContext()'.

    I tried adding the reference to System.Data.Entity.dll back in, but that didn't work, and I know that you're not supposed to add that with the new Entity Framework.

    Read the article

  • Splitting assemblies - finding the balance (avoiding overkill)

    - by M.A. Hanin
    I'm writing a wide component infrastructure to be used in my projects. Since not all projects will require every component created, I've been thinking of splitting the components into discrete assemblies, so that every application developed will only be deployed with the assemblies it requires. I assume that creating an assembly has some storage overhead (the assembly's code, wrapping whatever is inside). Therefore, there must be some limit to the advantage gained by splitting an assembly - a certain point where splitting the assembly is worse than keeping it united (storage-wise and performance-wise). Now, here is the question: how do I know when splitting an assembly is overkill? P.S. I guess there are other overheads to assembly splitting, aside from the storage overhead. If anyone can point out these overheads, it would be much appreciated.

    Read the article

  • Subsume external library into source tree with Autotools.

    - by troutwine
    I am developing a new project, using Autotools for my build infrastructure. I would like to subsume external dependencies into my source tree. These dependencies use Autotools as well. How can I configure my project's build scripts to build and link against subsumed dependencies? Though Duret-Lutz's tutorial is excellent, this situation is only briefly addressed in a few slides, and I found his explanation deeply confusing. By adding the directory names of subsumed dependencies to the top-level Makefile.am's SUBDIRS, the dependencies get configured and built. It is possible to manually set include paths through CFLAGS, but how do I link against libtool .la files?

    Read the article

  • DragDrop-Support of PictureBox-Control

    - by Effdee
    After some searching I figured out how drag-and-drop is implemented for a PictureBox. But there is one thing - the (inherited, of course) AllowDrop property isn't accessible from code or from the Properties window of the PictureBox class. So to make it work I added the following line to my form's Load handler:

        ((Control)pictureBox1).AllowDrop = true;

    Why do I have to do that? In MSDN it says: "This API supports the .NET Framework infrastructure and is not intended to be used directly from your code." Any explanation appreciated, and sorry for my grammar ;)
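    For context, here is a minimal sketch of the whole pattern the cast enables: PictureBox re-declares AllowDrop (and the drag events) as non-browsable, but the inherited Control implementation still works once reached through a cast, and the events can be wired up normally in code. The form and control names are hypothetical:

        using System;
        using System.Windows.Forms;

        public partial class MainForm : Form
        {
            private void MainForm_Load(object sender, EventArgs e)
            {
                // The property is hidden from the designer/IntelliSense on PictureBox,
                // but the Control-level implementation is reachable via a cast.
                ((Control)pictureBox1).AllowDrop = true;
                pictureBox1.DragEnter += PictureBox1_DragEnter;
                pictureBox1.DragDrop += PictureBox1_DragDrop;
            }

            private void PictureBox1_DragEnter(object sender, DragEventArgs e)
            {
                // Only accept file drops in this sketch.
                e.Effect = e.Data.GetDataPresent(DataFormats.FileDrop)
                    ? DragDropEffects.Copy
                    : DragDropEffects.None;
            }

            private void PictureBox1_DragDrop(object sender, DragEventArgs e)
            {
                var files = (string[])e.Data.GetData(DataFormats.FileDrop);
                if (files.Length > 0)
                {
                    pictureBox1.ImageLocation = files[0]; // show the first dropped image
                }
            }
        }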

    Read the article

  • Shared Source CLI 4.0?

    - by Paul Alexander
    Microsoft released the Shared Source Common Language Infrastructure (the codebase previously known as Rotor) some years ago, basically as a reference implementation of the .NET runtime. While the actual .NET runtime (mscorlib, mscoree, mscorjit, etc.) isn't compiled from that code, debugging shows that they are remarkably similar and at a minimum share many of the same memory structures. This has been an invaluable resource when debugging tricky system behavior with .NET 2.0 compiled assemblies. Now that 4.0 has been released with major changes to the runtime, I'd love to find the reference source for that as well. Microsoft has changed names for the source in the past, so I'm either searching for the wrong thing or it hasn't been released. Is there reference source for a .NET 4.0 compatible runtime?

    Read the article

  • Adding / Removing values for an email-based daily stock value reporter - made in PHP

    - by Dave
    Hi, I'm building a very simple email-based website where users, when registering, list all the stock tickers they're interested in following. The program then, on a daily basis, fetches that information and sends it out to every user. I have the portion which fetches the information from the stock websites, but I'm looking for an "infrastructure" that allows: (a) a user to send an email to [email protected] with the subject "Subscribe" and with the body containing stock ticker values, and (b) a user to send an email to service@... with the subject "Unsubscribe" and a body containing similar values. Looking for code in PHP, please. Any insights?

    Read the article

  • Storing Configurations in Active Directory Application Mode

    - by Khurram Aziz
    I have an app that polls network devices and performs actions; currently it keeps the configuration (which devices to poll, device type, IP, login, password, etc.) in a database. My network administrator wants this information stored in an LDAP server so that he can maintain a single configuration store, which he can also use in other apps and scripts. I am looking for an article that walks me through setting up ADAM/AD LDS for storing configuration by authoring a custom schema, and through setting up authentication infrastructure to protect the data.
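    Once an AD LDS instance and partition exist, the polling app can read its device entries with System.DirectoryServices. The sketch below is only illustrative: the LDAP path, container names, object class, attribute names and credentials are all assumptions standing in for whatever custom schema the administrator defines:

        using System;
        using System.DirectoryServices;   // reference System.DirectoryServices.dll

        class DevicePollingConfig
        {
            static void Main()
            {
                // Hypothetical AD LDS partition and service account.
                using (var root = new DirectoryEntry(
                    "LDAP://ldsserver:389/CN=Devices,CN=PollingConfig,DC=corp,DC=local",
                    @"CORP\svc-poller",
                    "password-from-a-secure-store",
                    AuthenticationTypes.Secure))
                using (var searcher = new DirectorySearcher(root, "(objectClass=device)"))
                {
                    // "cn" and "ipHostNumber" stand in for whatever attributes the custom schema uses.
                    foreach (SearchResult result in searcher.FindAll())
                    {
                        Console.WriteLine("Device: {0}, IP: {1}",
                            result.Properties["cn"][0],
                            result.Properties["ipHostNumber"][0]);
                    }
                }
            }
        }

    Binding with explicit credentials and AuthenticationTypes.Secure keeps the read path authenticated; write access can then be restricted to the administrator's own tools via the directory's ACLs.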

    Read the article

  • Easiest way to retrofit retry logic on LINQ to SQL migration to SQL Azure

    - by Pat James
    I have a couple of existing ASP.NET Web Forms and MVC applications that currently use LINQ to SQL with a SQL Server 2008 Express database on a Windows VPS: one VPS for both IIS and SQL. I am starting to outgrow the VPS's ability to effectively host both SQL and IIS and am getting ready to split them up. I am considering migrating the database to SQL Azure and keeping IIS on the VPS. After doing initial research it sounds like implementing retry logic in the data access layer is a must-do when adopting SQL Azure. I suspect this is even more critical to implement in my situation where IIS will be on a VPS outside of the Azure infrastructure. I am looking for pointers on how to do this with the least effort and impact on my existing code base. Is there a good retry pattern that can be applied once at the LINQ to SQL data access layer, as opposed to having to wrap all of my LINQ to SQL operations in try/catch/wait/retry logic?
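    A minimal sketch of the kind of single-choke-point retry wrapper being asked about. The list of transient error numbers, the backoff policy and the helper names are assumptions to be tuned, not part of LINQ to SQL itself:

        using System;
        using System.Data.SqlClient;
        using System.Threading;

        public static class SqlAzureRetry
        {
            // Error numbers SQL Azure commonly raises for transient conditions (assumed list).
            private static readonly int[] TransientErrors = { 40501, 40197, 40613, 10053, 10054, 10060, 64 };

            public static T Execute<T>(Func<T> operation, int maxAttempts = 3)
            {
                for (int attempt = 1; ; attempt++)
                {
                    try
                    {
                        return operation();
                    }
                    catch (SqlException ex)
                    {
                        if (attempt >= maxAttempts || Array.IndexOf(TransientErrors, ex.Number) < 0)
                            throw;                                        // not transient, or out of attempts
                        Thread.Sleep(TimeSpan.FromSeconds(attempt * 2));  // simple linear backoff
                    }
                }
            }

            public static void Execute(Action operation, int maxAttempts = 3)
            {
                Execute(() => { operation(); return 0; }, maxAttempts);
            }

            // Usage at the data access layer, keeping the LINQ to SQL queries themselves unchanged
            // (OrdersDataContext is a hypothetical DataContext):
            //
            //   var orders = SqlAzureRetry.Execute(() =>
            //   {
            //       using (var db = new OrdersDataContext())
            //           return db.Orders.ToList();
            //   });
        }

    Wrapping the repository or DataContext-creating methods in one place like this avoids scattering try/catch/wait/retry logic across every query.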

    Read the article

  • The difference between traditional DLL and COM DLL.

    - by smwikipedia
    I am currently studying COM. I found that a COM DLL is essentially built upon the traditional DLL infrastructure. When we build COM DLLs, we still rely on the traditional DLL export methods to lead us to the internal COM coclasses. If COM is for component reuse at the binary level, I think the traditional DLL can achieve the same thing. They both expose functions, and they are both binary, so what's the point of turning to the COM approach? Currently, I have the feeling that the traditional DLL exposes methods in a "flat" manner, while the COM DLL exposes methods in an "OOP", hierarchical manner, and the OOP manner seems to be the better approach. Could this be the reason why COM prevails? Many thanks.

    Read the article

  • Loading and storing encryption keys from a config source

    - by Hassan Syed
    I am writing an application which has an authenticity mechanism using HMAC-SHA1, plus a CBC Blowfish pass over the data for good measure. This requires two keys and one ivec. I have looked at Crypto++, but the documentation is very poor (for example the HMAC documentation), so I am going old school and using OpenSSL. What's the best way to generate and load these keys using library functions and tools? I don't require a secure socket, therefore an X.509 certificate probably does not make sense, unless, of course, I am missing something. So, do I need to write my own config file, or is there any infrastructure in OpenSSL for this? If so, could you direct me to some documentation or examples for it.

    Read the article

  • Silverlight error-handling conventions: There is no relationship between onSilverlightError and ReportErrorToDOM

    - by rasx
    When I see the call System.Windows.Browser.HtmlPage.Window.Eval (which is evil) in ReportErrorToDOM (in App.xaml.cs), it shows me that it has no relationship to onSilverlightError. So what kind of JavaScript-based scenario calls onSilverlightError? When will onSilverlightError definitely be needed? What are the Silverlight error-handling conventions in general? This is a very important comment by Erik Monk, but it needs more detail: "There are 2 kinds of terminal errors in Silverlight. 1) Managed errors (hit the managed Application_UnhandledException method). Note that some errors may not even get to this point. If the managed infrastructure can't be loaded for some reason (out of memory error maybe...), you won't get this kind of error. Still, if you can get it, you can use a web service (or the CLOG project) to communicate it back to the server. 2) Javascript errors."
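    A sketch of the managed half of that split: terminal managed errors funnel through Application_UnhandledException, where they can be reported out-of-band instead of Eval-ing script into the host page. The error-reporting service below is a hypothetical WCF/HTTP endpoint, not part of the Silverlight template; onSilverlightError (in the hosting page's JavaScript) only comes into play for errors the managed layer never sees, such as plug-in load or XAML parse failures:

        using System;
        using System.Windows;

        public partial class App : Application
        {
            private void Application_UnhandledException(object sender, ApplicationUnhandledExceptionEventArgs e)
            {
                if (!System.Diagnostics.Debugger.IsAttached)
                {
                    // Mark handled so the plug-in keeps running, then report the error.
                    e.Handled = true;
                    ReportToServer(e.ExceptionObject);
                }
            }

            private void ReportToServer(Exception ex)
            {
                // e.g. new ErrorLoggingServiceClient().LogErrorAsync(ex.ToString());
                // (hypothetical generated service proxy; replace with your own logging channel)
            }
        }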

    Read the article

  • ASP.NET MVC 3 Linq uppercase or lowercase database search

    - by user1495557
    I need immediate help, and I know little English. ASP.NET MVC 3, LINQ, uppercase or lowercase Contains search. Example:

        string metin = "baris";

        var IcerikAra = (from icerik in Context.dbDokumanEditor
                         join kategori in Context.dbDokumanKategori
                             on icerik.KategoriID equals kategori.KategoriID
                         where icerik.Icerik.ToLower().Contains(metin)
                         select new
                         {
                             KategoriID = kategori.KategoriID,
                             KategoriAd = kategori.KategoriAd
                         }).ToList();

    Exception stack trace:

        at System.Data.EntityClient.EntityCommandDefinition.ExecuteStoreCommands(EntityCommand entityCommand, CommandBehavior behavior)
        at System.Data.Objects.Internal.ObjectQueryExecutionPlan.Execute[TResultType](ObjectContext context, ObjectParameterCollection parameterValues)
        at System.Data.Objects.ObjectQuery`1.GetResults(Nullable`1 forMergeOption)
        at System.Data.Objects.ObjectQuery`1.System.Collections.Generic.IEnumerable.GetEnumerator()
        at System.Data.Entity.Internal.Linq.InternalQuery`1.GetEnumerator()
        at System.Data.Entity.Infrastructure.DbQuery`1.System.Collections.Generic.IEnumerable.GetEnumerator()
        at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
        at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
        at Plus.Areas.DokumanEditor.Controllers.DokumanController.DokumanIcerikAramaBaslat(String metin)

    Error message: An error occurred while executing the command definition. See the inner exception for details.

    Thanks.
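    For reference, a sketch of the case-insensitive search pattern the question is aiming at, reusing the question's (hypothetical) context and entity names. Lowering both sides keeps the comparison consistent regardless of column collation, and the null guard avoids failures on rows with a null Icerik value; LINQ to Entities can translate both ToLower() and Contains() to SQL:

        string metin = "baris".ToLower();

        var icerikAra = (from icerik in Context.dbDokumanEditor
                         join kategori in Context.dbDokumanKategori
                             on icerik.KategoriID equals kategori.KategoriID
                         where icerik.Icerik != null
                               && icerik.Icerik.ToLower().Contains(metin)
                         select new
                         {
                             kategori.KategoriID,
                             kategori.KategoriAd
                         }).ToList();

    The actual root cause of the exception above is in the inner exception the message points to, which is worth inspecting before changing the query.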

    Read the article

  • What are the Pros & Cons of using SQL Azure for existing apps on dedicated servers

    - by Mark Redman
    We currently own our own servers and rent a rack in a datacentre. Looking at the pricing, scalability and SLAs for SQL Azure, I am thinking it might be viable to use only SQL Azure but continue to run our existing applications on our own servers in the datacentre. This would let us stop worrying about the database and its infrastructure so we can concentrate on building an application server farm with disk storage for files etc. Our application is quite big, has various Windows services, and parts of it use unmanaged libraries that may not be feasible in the cloud, so we probably couldn't have everything in the Azure cloud. The pros: reduced total cost of ownership (no database servers, no SQL Server licenses). The cons: I guess there would be overhead in the transfer of data between the Azure cloud and our datacentre (i.e. the cloud may be in the US and the datacentre in the UK), but would this overhead be acceptable?

    Read the article

  • SSRS 2008 Snapshotting Security

    - by Holy Christ
    Hi, I'm writing a report that will show data based on the User!UserID built into the SSRS infrastructure. The data is sensitive to the user's department. In addition to these department users, there will be admins that should be able to run for all departments, or have a report parameter to run for a specific department. Ideally, I'd like to use SSRS snapshotting so that users can rerun a report they ran on a previous date. It's important that a user can only view the snapshots he created for his department. My questions are: 1.) Does SSRS snapshotting provide a mechanism to limit viewing snapshots by the user that created them? 2.) Will I need to write two reports, one for the admin and one for the department users? I think I do since there isn't a way to secure report parameters. Thanks!

    Read the article

  • Domain-driven design with Zend

    - by mik
    This question is a continuation of my previous question here: http://stackoverflow.com/questions/2122850/zend-models-architecture (big thanks to Bill Karwin). I've done some reading, including this article http://weierophinney.net/matthew/archives/202-Model-Infrastructure.html and this question http://stackoverflow.com/questions/373054/how-to-properly-create-domain-using-zend-framework. Now I understand what domain-driven design is, but the examples are still very simple and poor: they are based on one table and one model. Now, my question is: do people use domain-driven design in real-world PHP projects? I've been looking for good documentation about this, but I haven't found anything good enough that explains how to manage several tables and map them to domain objects. As far as I know, there is the Hibernate library that has these features in Java, but what should I use in PHP (Zend Framework)?

    Read the article

  • Where and how to validate and map ViewModel?

    - by chobo
    Hi, I am trying to learn domain-driven design and recently read that lots of people advocate creating ViewModels for your views that store all the values you want to display in a given view. My question is: how should I do the form validation? Should I create separate validation classes for each view, or group them together? I'm also confused about what this would look like in code. This is how I currently think validation and ViewModels fit into the scheme of things: View (some user input) -> Controller -> FormValidation (of ViewModel) -> (if valid, map ViewModel to Domain Model) -> Domain Layer Service -> Infrastructure. Thanks! P.S. I use ASP.NET MVC with C#.
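    A minimal sketch of that flow in ASP.NET MVC, keeping the validation rules on the ViewModel itself via DataAnnotations so no separate validation class is needed for simple cases. CustomerRegistrationViewModel, Customer and ICustomerService are hypothetical names standing in for your own types:

        using System.ComponentModel.DataAnnotations;
        using System.Web.Mvc;

        // Domain side (simplified placeholders)
        public class Customer { public string Name { get; set; } public string Email { get; set; } }
        public interface ICustomerService { void Register(Customer customer); }

        // ViewModel: carries only what the view needs, plus its validation rules.
        public class CustomerRegistrationViewModel
        {
            [Required, StringLength(100)]
            public string Name { get; set; }

            [Required, RegularExpression(@".+@.+\..+", ErrorMessage = "Enter a valid email address.")]
            public string Email { get; set; }
        }

        public class CustomerController : Controller
        {
            private readonly ICustomerService _customerService;

            public CustomerController(ICustomerService customerService)
            {
                _customerService = customerService;
            }

            [HttpPost]
            public ActionResult Register(CustomerRegistrationViewModel model)
            {
                if (!ModelState.IsValid)
                    return View(model);                 // form validation failed: redisplay with errors

                var customer = new Customer             // map ViewModel -> domain model
                {
                    Name = model.Name,
                    Email = model.Email
                };
                _customerService.Register(customer);    // domain layer service
                return RedirectToAction("Thanks");
            }
        }

    Rules that span multiple views or belong to the business itself are better enforced again in the domain layer; the ViewModel attributes only guard the input surface.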

    Read the article

  • .Net Logger (Write your own vs log4net/enterprise logger/nlog etc.)

    - by Jack
    I work for an IT department with about 50+ developers. It used to be about 100+ developers but was cut because of the recession. When our department was bigger, an ambitious effort was made to set up a special architecture group. One thing this group decided to do was create our own internal logger. They thought it was such a simple task that we could spend resources and do it ourselves. Now we are having issues with performance and difficulty viewing the logs generated, and some employees are frustrated that we are spending resources on infrastructure like this instead of focusing on serving our business and using something that already exists, like log4net or Enterprise Logger. Can you assist me in listing reasons why you should not create your own .NET logger? Reasons why you should are also welcome, to get a fair point of view :)
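    To make the "use an existing library" side of the comparison concrete, here is a minimal sketch of what log4net costs in application code: one configuration call and a logger per class, with appenders, levels and rolling files handled by configuration rather than custom infrastructure code. The example program is illustrative only:

        using System;
        using log4net;
        using log4net.Config;

        class Program
        {
            private static readonly ILog Log = LogManager.GetLogger(typeof(Program));

            static void Main()
            {
                // BasicConfigurator wires up a simple console appender;
                // XmlConfigurator.Configure() would read appenders from app.config instead.
                BasicConfigurator.Configure();

                Log.Info("Application starting");
                try
                {
                    throw new InvalidOperationException("demo failure");
                }
                catch (Exception ex)
                {
                    Log.Error("Something went wrong", ex);
                }
            }
        }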

    Read the article

  • Create UML diagrams after or before coding?

    - by ajsie
    I can clearly see the benefits of having UML diagrams showing the infrastructure of your application (class names, their members, how they communicate with each other, etc.). I'm starting a new project right now and have already structured the database (with Visual Paradigm). I want to use some design patterns to guide how I code the classes. I wonder: should I code the classes first and then create UML diagrams from them (generating diagrams from code seems possible), or should I first create the UML diagrams and then code (or generate code from the UML, which also seems possible)? What do your experiences tell you is the best way?

    Read the article

  • WCF Business logic handling

    - by Raj
    I have a WCF service that supports about 10 contracts. We have been supporting one client, with all the business rules specific to that client. Now we have another client who will be using the exact same contracts (so we cannot change them) and will be calling the service exactly the same way the previous client does. The only way we can differentiate between the two clients is by one of the input parameters. Based on this input parameter we have to use slightly different business logic: the logic for the two clients will be the same about 50% of the time, and the remainder will have different logic (across the business and DAL layers). I don't want to use if/else statements in each contract implementation to differentiate and reroute the logic; also, what if another client comes in? Is there a clean way of handling a situation like this? I am using Framework 3.5. Like I said, I cannot change any of the contracts (service or data contracts) or the current service-calling infrastructure for the new client. Thanks
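    One common shape for this, sketched below under assumed names (IOrderService, ProcessOrder, the client codes): keep the contracts untouched and put a strategy per client behind a small factory keyed on the distinguishing input parameter, so adding a third client means adding a strategy rather than more branching in each operation:

        using System.Collections.Generic;
        using System.ServiceModel;

        [ServiceContract]
        public interface IOrderService
        {
            [OperationContract]
            string ProcessOrder(string clientCode, string orderXml);
        }

        public interface IOrderProcessingStrategy
        {
            string Process(string orderXml);
        }

        public class ClientAProcessing : IOrderProcessingStrategy
        {
            public string Process(string orderXml) { /* client A specific rules */ return "A:" + orderXml; }
        }

        public class ClientBProcessing : IOrderProcessingStrategy
        {
            public string Process(string orderXml) { /* client B specific rules */ return "B:" + orderXml; }
        }

        public static class ProcessingStrategyFactory
        {
            private static readonly Dictionary<string, IOrderProcessingStrategy> Strategies =
                new Dictionary<string, IOrderProcessingStrategy>
                {
                    { "CLIENT_A", new ClientAProcessing() },
                    { "CLIENT_B", new ClientBProcessing() }
                };

            public static IOrderProcessingStrategy For(string clientCode)
            {
                IOrderProcessingStrategy strategy;
                return Strategies.TryGetValue(clientCode, out strategy) ? strategy : Strategies["CLIENT_A"];
            }
        }

        public class OrderService : IOrderService
        {
            public string ProcessOrder(string clientCode, string orderXml)
            {
                // The contract and the call pattern stay the same; only the selected strategy differs.
                return ProcessingStrategyFactory.For(clientCode).Process(orderXml);
            }
        }

    The shared 50% of the logic can live in a common base class or in services both strategies call, so only the genuinely divergent rules are duplicated.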

    Read the article

  • GAE and Socket Data

    - by Vinod
    I have a field device which keeps sending data to any designated port using sockets. I am planning to use GAE for the server-side infrastructure. I read that GAE does not support sockets, but I can configure the device to send the data over port 80, so we wrote a generic servlet to capture this data on GAE. However, it is not obtaining any values from the client. Any suggestions to fix this issue?

    Read the article
