Search Results


  • What’s new in ASP.NET 4.0: Core Features

    - by Rick Strahl
    Microsoft released the .NET Runtime 4.0 and with it comes a brand spanking new version of ASP.NET – version 4.0 – which provides an incremental set of improvements to an already powerful platform. .NET 4.0 is a full release of the .NET Framework, unlike version 3.5, which was merely a set of library updates on top of the .NET Framework version 2.0. Because of this full framework revision, there has been a welcome bit of consolidation of assemblies and configuration settings. The full runtime version change to 4.0 also means that you have to explicitly pick version 4.0 of the runtime when you create a new Application Pool in IIS, unlike .NET 3.5, which actually requires version 2.0 of the runtime. In this first of two parts I'll take a look at some of the changes in the core ASP.NET runtime. In the next edition I'll go over improvements in Web Forms and Visual Studio.

    Core Engine Features

    Most of the high profile improvements in ASP.NET have to do with Web Forms, but there are a few gems in the core runtime that should make life easier for ASP.NET developers. The following list describes some of the things I've found useful among the new features.

    Clean web.config Files Are Back!

    If you've been using ASP.NET 3.5, you have probably noticed that the web.config file has turned into quite a mess of configuration settings between all the custom handler and module mappings for the various web server versions. Part of the reason for this mess is that .NET 3.5 is a collection of add-on components running on top of the .NET Runtime 2.0, and so almost all of the new features of .NET 3.5 were essentially introduced as custom modules and handlers that had to be explicitly configured in the config file. Because the core runtime didn't rev with 3.5, all those configuration options couldn't be moved up to other configuration files in the system chain. With version 4.0 a consolidation was possible, and the result is a much simpler web.config file by default. A default empty ASP.NET 4.0 Web Forms project looks like this:

        <?xml version="1.0"?>
        <configuration>
          <system.web>
            <compilation debug="true" targetFramework="4.0" />
          </system.web>
        </configuration>

    Need I say more?

    Configuration Transformation Files to Manage Configurations and Application Packaging

    ASP.NET 4.0 introduces the ability to create multi-target configuration files. This means it's possible to create a single configuration file that can be transformed based on relatively simple replacement rules using the XML Document Transform (XDT) syntax provided by Visual Studio and WebDeploy. The idea is that you can create a 'master' configuration file and then create customized versions of it by applying relatively simple search-and-replace, add, or remove logic to specific elements and attributes in the original file.
    To give you an idea, here's the example code that Visual Studio creates for a default web.Release.config file, which replaces a connection string, removes the debug attribute, and replaces the customErrors section:

        <?xml version="1.0"?>
        <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
          <connectionStrings>
            <add name="MyDB"
                 connectionString="Data Source=ReleaseSQLServer;Initial Catalog=MyReleaseDB;Integrated Security=True"
                 xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
          </connectionStrings>
          <system.web>
            <compilation xdt:Transform="RemoveAttributes(debug)" />
            <customErrors defaultRedirect="GenericError.htm" mode="RemoteOnly" xdt:Transform="Replace">
              <error statusCode="500" redirect="InternalError.htm"/>
            </customErrors>
          </system.web>
        </configuration>

    You can see the XDT transform syntax that drives this functionality. Basically, only the elements listed in the override file are matched and updated – all the rest of the original web.config file stays intact. Visual Studio 2010 supports this functionality directly in the project system, so it's easy to create and maintain these customized configurations in the project tree. Once you're ready to publish your application, you can then use the Publish <yourWebApplication> option on the Build menu, which allows publishing to disk, via FTP, or to a Web Server using Web Deploy. You can also create a deployment package as a .zip file which can be used by the WebDeploy tool to configure and install the application. You can manually run the Web Deploy tool or use the IIS Manager to install the package on the server or other machine. You can find out more about WebDeploy and packaging here: http://tinyurl.com/2anxcje.

    Improved Routing

    Routing provides a relatively simple way to create clean URLs with ASP.NET by associating a template URL path and routing it to a specific ASP.NET HttpHandler. Microsoft first introduced routing with ASP.NET MVC and then integrated it into the core ASP.NET engine via a separate ASP.NET routing assembly. In ASP.NET 4.0, the process of using routing functionality gets a bit easier. First, routing is now rolled directly into System.Web, so no extra assembly reference is required in your projects to use routing. The RouteCollection class now includes a MapPageRoute() method that makes it easy to route to any ASP.NET Page request without first having to create an IRouteHandler implementation. It would have been nice if this could have been extended to serve *any* handler implementation, but unfortunately for anything but Page-derived handlers you will still have to implement a custom IRouteHandler. ASP.NET Pages now include a RouteData collection that will contain route information. Retrieving route data is now a lot easier: simply use this.RouteData.Values["routeKey"], where routeKey is the value specified in the route template (i.e., "users/{userId}" would use Values["userId"]).
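    Here's a minimal sketch of the two pieces together (the route, page, and file names are made up for illustration): a route registered in Application_Start with MapPageRoute(), and the route value read back on the target page.

        // Global.asax.cs
        using System;
        using System.Web.Routing;

        public class Global : System.Web.HttpApplication
        {
            void Application_Start(object sender, EventArgs e)
            {
                // Routes "users/ricks" to UserPage.aspx without a custom IRouteHandler
                RouteTable.Routes.MapPageRoute("users", "users/{userId}", "~/UserPage.aspx");
            }
        }

        // UserPage.aspx.cs
        public partial class UserPage : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                // The {userId} segment captured from the URL template
                string userId = (string)this.RouteData.Values["userId"];
            }
        }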
    The Page class also has a GetRouteUrl() method that you can use to create URLs from route data values rather than hardcoding the URL:

        <%= this.GetRouteUrl("users", new { userId = "ricks" }) %>

    You can also use the new <%$ RouteUrl %> expression syntax to accomplish something similar, which can be easier to embed into Page or MVC View code:

        <a runat="server" href='<%$ RouteUrl:RouteName=users, userId=ricks %>'>Visit User</a>

    Finally, the Response object also includes a new RedirectToRoute() method to build a route URL for redirection without hardcoding it:

        Response.RedirectToRoute("users", new { userId = "ricks" });

    All of these routines are helpers that have been integrated into the core ASP.NET engine to make it easier to create routes and retrieve route data, which hopefully will result in more people taking advantage of routing in ASP.NET. To find out more about the routing improvements, check out Dan Maharry's blog, which has a couple of nice entries on this subject: http://tinyurl.com/37trutj and http://tinyurl.com/39tt5w5.

    Session State Improvements

    Session state is an often used and abused feature in ASP.NET, and version 4.0 introduces a few enhancements geared towards making session state more efficient and minimizing at least some of the ill effects of overuse. The first improvement affects out-of-process session state, which is typically used in web farm environments or for sites that store application-sensitive data that must survive AppDomain restarts (which in my opinion is just about any application). When using out-of-process session state, ASP.NET serializes all the data in the session state bag into a blob that gets carried over the network and stored either in the State Server or in SQL Server via the session provider. Version 4.0 improves this serialization by offering a compression option on the web.config <sessionState> section (compressionEnabled="true"), which forces the serialized session state to be compressed. Depending on the type of data being serialized, this compression can reduce the size of the data travelling over the wire by as much as a third. It works best on string data, but can also reduce the size of binary data.

    In addition, ASP.NET 4.0 now offers a way to programmatically turn session state on or off as part of request processing. In prior versions, the only way to specify whether session state is available was by implementing a marker interface (such as IRequiresSessionState) on the HTTP handler implementation. In ASP.NET 4.0, you can turn session state on and off programmatically via HttpContext.Current.SetSessionStateBehavior() as part of the ASP.NET module pipeline processing, as long as it occurs before the AcquireRequestState pipeline event.
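    As a minimal sketch of what that looks like (the module name and the path check are made-up examples), an HttpModule can switch session state off early in the pipeline, well before AcquireRequestState fires:

        using System;
        using System.Web;
        using System.Web.SessionState;

        // Hypothetical module that disables session state for /static/ requests
        public class SessionControlModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                // BeginRequest occurs before AcquireRequestState, so changing
                // the behavior here is still allowed
                app.BeginRequest += (sender, e) =>
                {
                    HttpContext context = ((HttpApplication)sender).Context;
                    if (context.Request.Path.StartsWith("/static/", StringComparison.OrdinalIgnoreCase))
                    {
                        context.SetSessionStateBehavior(SessionStateBehavior.Disabled);
                    }
                };
            }

            public void Dispose() { }
        }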
    Output Cache Provider

    Output caching in ASP.NET has been a very useful but potentially memory-intensive feature. The default OutputCache mechanism works through in-memory storage that persists generated output based on various lifetime-related parameters. While this works well enough for many intended scenarios, it can also quickly cause runaway memory consumption as the cache fills up and serves many variations of pages on your site. ASP.NET 4.0 introduces a provider model for the OutputCache module, so it becomes possible to plug in custom storage strategies for cached pages.

    One of the goals also appears to be to consolidate some of the different cache storage mechanisms used in .NET into a generic Windows AppFabric framework in the future, so that various mechanisms like OutputCache, the non-Page-specific ASP.NET cache, and possibly even session state can eventually use the same caching engine for storage of persisted data, both in-memory and in out-of-process scenarios. For developers, the OutputCache provider feature means that you can now extend caching on your own by implementing a custom cache provider based on the System.Web.Caching.OutputCacheProvider class. You can find more info on creating an Output Cache provider in Gunnar Peipman's blog at http://tinyurl.com/2vt6g7l.
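    To give a feel for the shape of the provider API, here's a minimal sketch of a custom provider backed by a process-local dictionary (a stand-in for whatever store you'd actually use; the class and registration names are made up):

        using System;
        using System.Collections.Concurrent;
        using System.Web.Caching;

        // Hypothetical provider that keeps cached output in a static dictionary
        public class DictionaryOutputCacheProvider : OutputCacheProvider
        {
            static readonly ConcurrentDictionary<string, Tuple<object, DateTime>> store =
                new ConcurrentDictionary<string, Tuple<object, DateTime>>();

            public override object Get(string key)
            {
                Tuple<object, DateTime> entry;
                if (store.TryGetValue(key, out entry) && entry.Item2 > DateTime.UtcNow)
                    return entry.Item1;
                store.TryRemove(key, out entry);   // missing or expired
                return null;
            }

            public override object Add(string key, object entry, DateTime utcExpiry)
            {
                // Add() must return the existing entry if one is already cached
                object existing = Get(key);
                if (existing != null)
                    return existing;
                Set(key, entry, utcExpiry);
                return entry;
            }

            public override void Set(string key, object entry, DateTime utcExpiry)
            {
                store[key] = Tuple.Create(entry, utcExpiry);
            }

            public override void Remove(string key)
            {
                Tuple<object, DateTime> entry;
                store.TryRemove(key, out entry);
            }
        }

    The provider is then registered and made the default in web.config:

        <system.web>
          <caching>
            <outputCache defaultProvider="DictionaryCache">
              <providers>
                <add name="DictionaryCache" type="DictionaryOutputCacheProvider" />
              </providers>
            </outputCache>
          </caching>
        </system.web>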
    Response.RedirectPermanent

    ASP.NET 4.0 includes support for issuing a permanent redirect as an HTTP 301 Moved Permanently response, rather than the standard 302 redirect. In pre-4.0 versions you had to create your permanent redirect manually by setting the Status and StatusCode properties – Response.RedirectPermanent() makes this operation more obvious and discoverable. There's also a Response.RedirectToRoutePermanent() which provides permanent redirection for route URLs.

    Preloading of Applications

    ASP.NET 4.0 provides a new feature to preload ASP.NET applications on startup, which is meant to provide a more consistent startup experience. If your application has a lengthy startup cycle, it can appear very slow to serve data to clients while it is warming up and loading initial resources. So rather than serve these startup requests slowly, ASP.NET 4.0 lets you force the application to initialize itself fully before it starts accepting requests for processing. This feature works only in combination with IIS 7.5 (Windows 7 and Windows Server 2008 R2). You can set up a worker process in IIS 7.5 to always be running, which starts the Application Pool worker process immediately. ASP.NET 4.0 then allows you to specify site-specific settings by setting serviceAutoStartEnabled on a particular site, along with an optional serviceAutoStartProvider class that can be used to receive "startup events" when the application starts up. This event in turn can be used to configure the application and optionally pre-load cache data and other information required by the app on startup. The configuration settings need to be made in applicationhost.config:

        <sites>
          <site name="WebApplication2" id="1">
            <application path="/"
                         serviceAutoStartEnabled="true"
                         serviceAutoStartProvider="PreWarmup" />
          </site>
        </sites>
        <serviceAutoStartProviders>
          <add name="PreWarmup" type="PreWarmupProvider,MyAssembly" />
        </serviceAutoStartProviders>

    Hooking up a warm-up provider is optional, so you can omit the provider definition and reference. If you do define one, here's what it looks like:

        public class PreWarmupProvider : System.Web.Hosting.IProcessHostPreloadClient
        {
            public void Preload(string[] parameters)
            {
                // initialization for the app
            }
        }

    While this code is running, ASP.NET/IIS will hold requests from hitting the pipeline, so the application will not start taking requests until it completes. The idea is that you can perform any pre-loading of resources and cache values so that the first request will perform at an optimal level, without lag.

    Runtime Performance Improvements

    According to Microsoft, there have also been a number of invisible performance improvements in the internals of the ASP.NET runtime that should make ASP.NET 4.0 applications run more efficiently and use fewer resources. These improvements require no changes in applications and are virtually transparent, except that you get the benefits simply by updating to ASP.NET 4.0.

    Summary

    The core feature set changes are minimal, which continues a tradition of small incremental changes to the ASP.NET runtime. ASP.NET has proven to be a solid platform, and I'm actually rather happy to see that most of the effort in this release went into stability, performance, and usability improvements rather than a massive amount of new features. The new functionality added in 4.0 is minimal but very useful. A lot of people are still running pure .NET 2.0 applications these days and have stayed off .NET 3.5 for some time. I think that version 4.0, with its full .NET runtime rev and its assembly and configuration consolidation, will make an attractive platform for developers to update to. If you're a Web Forms developer in particular, ASP.NET 4.0 includes a host of new features in the Web Forms engine that are significant enough to warrant a quick move to .NET 4.0. I'll cover those changes in my next column. Until then, I suggest you give ASP.NET 4.0 a spin and see for yourself how the new features can help you out.

    © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET.


  • Any clue in these logs why keyboard, audio, and internet are messed up?

    - by mmj
    Jun 7 00:01:18 Isis lightdm: pam_unix(lightdm-autologin:session): session opened for user mimi by (uid=0) Jun 7 00:01:18 Isis lightdm: pam_ck_connector(lightdm-autologin:session): nox11 mode, ignoring PAM_TTY :0 Jun 7 00:01:26 Isis polkitd(authority=local): Registered Authentication Agent for unix-session:/org/freedesktop/ConsoleKit/Session1 (system bus name :1.36 [/usr/lib/policykit-1-gnome/polkit-gnome-authentication-agent-1], object path /org/gnome/PolicyKit1/AuthenticationAgent, locale zh_CN.UTF-8) Jun 7 00:01:29 Isis dbus[610]: [system] Rejected send message, 2 matched rules; type="method_call", sender=":1.44" (uid=1000 pid=1763 comm="/usr/lib/indicator-datetime/indicator-datetime-ser") interface="org.freedesktop.DBus.Properties" member="GetAll" error name="(unset)" requested_reply="0" destination=":1.15" (uid=0 pid=1219 comm="/usr/sbin/console-kit-daemon --no-daemon ") Jun 7 00:07:55 Isis sudo: pam_unix(sudo:auth): authentication failure; logname=mimi uid=1000 euid=0 tty=/dev/pts/1 ruser=mimi rhost= user=mimi Jun 7 00:08:11 Isis sudo: mimi : TTY=pts/1 ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/add-apt-repository ppa:colingille/freshlight Jun 7 00:08:11 Isis sudo: pam_unix(sudo:session): session opened for user root by mimi(uid=1000) Jun 7 00:08:32 Isis sudo: pam_unix(sudo:session): session closed for user root Jun 7 00:11:20 Isis sudo: mimi : TTY=pts/1 ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/apt-get install gparted Jun 7 00:11:20 Isis sudo: pam_unix(sudo:session): session opened for user root by mimi(uid=1000) Jun 7 00:11:59 Isis sudo: pam_unix(sudo:session): session closed for user root Jun 7 00:17:02 Isis CRON[2651]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 00:17:02 Isis CRON[2651]: pam_unix(cron:session): session closed for user root Jun 7 00:17:32 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 successfully authenticated as unix-user:mimi to gain ONE-SHOT authorization for action com.ubuntu.pkexec.gparted for unix-process:2655:96838 [/bin/sh /usr/bin/gparted-pkexec] (owned by unix-user:mimi) Jun 7 00:17:32 Isis pkexec: pam_unix(polkit-1:session): session opened for user root by (uid=1000) Jun 7 00:17:32 Isis pkexec: pam_ck_connector(polkit-1:session): cannot determine display-device Jun 7 00:17:32 Isis pkexec[2657]: mimi: Executing command [USER=root] [TTY=unknown] [CWD=/home/mimi] [COMMAND=/usr/sbin/gparted] Jun 7 00:48:15 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 successfully authenticated as unix-user:mimi to gain ONE-SHOT authorization for action com.ubuntu.pkexec.gparted for unix-process:3813:281120 [/bin/sh /usr/bin/gparted-pkexec] (owned by unix-user:mimi) Jun 7 00:48:15 Isis pkexec: pam_unix(polkit-1:session): session opened for user root by (uid=1000) Jun 7 00:48:15 Isis pkexec: pam_ck_connector(polkit-1:session): cannot determine display-device Jun 7 00:48:15 Isis pkexec[3815]: mimi: Executing command [USER=root] [TTY=unknown] [CWD=/home/mimi] [COMMAND=/usr/sbin/gparted] Jun 7 01:17:01 Isis CRON[3960]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 01:17:01 Isis CRON[3960]: pam_unix(cron:session): session closed for user root Jun 7 02:08:52 Isis gnome-screensaver-dialog: gkr-pam: unlocked login keyring Jun 7 02:17:01 Isis CRON[4246]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 02:17:01 Isis CRON[4246]: pam_unix(cron:session): session closed for user root Jun 7 02:17:05 Isis sudo: 
mimi : TTY=pts/1 ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/apt-get install unetbootin Jun 7 02:17:05 Isis sudo: pam_unix(sudo:session): session opened for user root by mimi(uid=1000) Jun 7 02:17:57 Isis sudo: pam_unix(sudo:session): session closed for user root Jun 7 02:18:59 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:18:59 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:18:59 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:18:59 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:18:59 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:18:59 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:18:59 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/unetbootin 'rootcheck=no' Jun 7 02:18:59 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 02:19:26 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:19:26 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:19:26 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:19:26 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:19:26 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:19:26 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:19:26 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/unetbootin 'rootcheck=no' Jun 7 02:19:26 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 02:33:21 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:33:21 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:33:21 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:33:21 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:33:21 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 02:33:21 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 02:33:21 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/unetbootin 'rootcheck=no' Jun 7 02:33:21 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 02:40:04 Isis sudo: mimi : TTY=pts/1 ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/unetbootin rootcheck=no Jun 7 02:40:04 Isis sudo: pam_unix(sudo:session): session opened for user root by mimi(uid=1000) Jun 7 03:17:01 Isis CRON[5506]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 03:17:01 Isis CRON[5506]: pam_unix(cron:session): session closed for user root Jun 7 03:33:24 Isis sudo: pam_unix(sudo:session): session closed for user root Jun 7 03:33:43 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 03:33:43 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 03:33:43 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 03:33:43 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 03:33:43 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 03:33:43 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 03:33:43 Isis sudo: mimi : 3 incorrect password attempts ; TTY=pts/1 ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/unetbootin showall=yes 'rootcheck=no' Jun 7 03:33:43 Isis sudo: unable 
to execute /usr/sbin/sendmail: No such file or directory Jun 7 04:17:01 Isis CRON[6119]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 04:17:01 Isis CRON[6119]: pam_unix(cron:session): session closed for user root Jun 7 04:18:35 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 successfully authenticated as unix-user:mimi to gain TEMPORARY authorization for action org.debian.apt.install-or-remove-packages for system-bus-name::1.79 [/usr/bin/python /usr/bin/landscape-client-ui-install] (owned by unix-user:mimi) Jun 7 04:19:11 Isis groupadd[6702]: group added to /etc/group: name=landscape, GID=127 Jun 7 04:19:11 Isis groupadd[6702]: group added to /etc/gshadow: name=landscape Jun 7 04:19:11 Isis groupadd[6702]: new group: name=landscape, GID=127 Jun 7 04:19:11 Isis useradd[6706]: new user: name=landscape, UID=115, GID=127, home=/var/lib/landscape, shell=/bin/false Jun 7 04:19:12 Isis usermod[6711]: change user 'landscape' password Jun 7 04:19:12 Isis chage[6716]: changed password expiry for landscape Jun 7 04:19:37 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.canonical.LandscapeClientSettings.configure for unix-process:6146:1543697 [/usr/bin/python /usr/bin/landscape-client-settings-ui] (owned by unix-user:mimi) Jun 7 04:20:20 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.canonical.LandscapeClientSettings.configure for unix-process:6832:1555313 [/usr/bin/python /usr/bin/landscape-client-settings-ui] (owned by unix-user:mimi) Jun 7 04:21:04 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.ubuntu.languageselector.setsystemdefaultlanguage for unix-process:6827:1555123 [/usr/bin/python /usr/bin/gnome-language-selector] (owned by unix-user:mimi) Jun 7 04:21:08 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.ubuntu.languageselector.setsystemdefaultlanguage for unix-process:6827:1555123 [/usr/bin/python /usr/bin/gnome-language-selector] (owned by unix-user:mimi) Jun 7 04:21:44 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action org.debian.apt.install-or-remove-packages for system-bus-name::1.87 [/usr/bin/python /usr/bin/gnome-language-selector] (owned by unix-user:mimi) Jun 7 04:22:27 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 successfully authenticated as unix-user:mimi to gain TEMPORARY authorization for action com.canonical.LandscapeClientSettings.configure for unix-process:7830:1567424 [/usr/bin/python /usr/bin/landscape-client-settings-ui] (owned by unix-user:mimi) Jun 7 04:25:50 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.ubuntu.languageselector.setsystemdefaultlanguage for unix-process:7876:1584865 [/usr/bin/python /usr/bin/gnome-language-selector] (owned by unix-user:mimi) Jun 7 04:25:52 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action 
com.ubuntu.languageselector.setsystemdefaultlanguage for unix-process:7876:1584865 [/usr/bin/python /usr/bin/gnome-language-selector] (owned by unix-user:mimi) Jun 7 05:11:57 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 successfully authenticated as unix-user:mimi to gain TEMPORARY authorization for action org.debian.apt.install-or-remove-packages for system-bus-name::1.95 [/usr/bin/python /usr/bin/gnome-language-selector] (owned by unix-user:mimi) Jun 7 05:17:02 Isis CRON[8708]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 05:17:02 Isis CRON[8708]: pam_unix(cron:session): session closed for user root Jun 7 05:28:03 Isis lightdm: pam_unix(lightdm-autologin:session): session opened for user mimi by (uid=0) Jun 7 05:28:03 Isis lightdm: pam_ck_connector(lightdm-autologin:session): nox11 mode, ignoring PAM_TTY :0 Jun 7 05:28:17 Isis polkitd(authority=local): Registered Authentication Agent for unix-session:/org/freedesktop/ConsoleKit/Session1 (system bus name :1.32 [/usr/lib/policykit-1-gnome/polkit-gnome-authentication-agent-1], object path /org/gnome/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) Jun 7 05:28:32 Isis dbus[660]: [system] Rejected send message, 2 matched rules; type="method_call", sender=":1.44" (uid=1000 pid=1736 comm="/usr/lib/indicator-datetime/indicator-datetime-ser") interface="org.freedesktop.DBus.Properties" member="GetAll" error name="(unset)" requested_reply="0" destination=":1.17" (uid=0 pid=1333 comm="/usr/sbin/console-kit-daemon --no-daemon ") Jun 7 06:17:01 Isis CRON[2391]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 06:17:02 Isis CRON[2391]: pam_unix(cron:session): session closed for user root Jun 7 06:25:02 Isis CRON[2492]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 06:25:02 Isis CRON[2492]: pam_unix(cron:session): session closed for user root Jun 7 07:17:01 Isis CRON[3174]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 07:17:01 Isis CRON[3174]: pam_unix(cron:session): session closed for user root Jun 7 07:30:01 Isis CRON[3397]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 07:30:01 Isis CRON[3397]: pam_unix(cron:session): session closed for user root Jun 7 08:09:01 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:09:01 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:09:01 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:09:01 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:09:01 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:09:01 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:09:01 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/share/checkbox/backend --path=/usr/share/checkbox/scripts:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games /tmp/checkboxQbuE6V/input /tmp/checkboxQbuE6V/output Jun 7 08:09:01 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 08:09:59 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:09:59 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:09:59 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:09:59 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:09:59 Isis sudo: pam_unix(sudo:auth): 
conversation failed Jun 7 08:09:59 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:09:59 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/share/checkbox/backend --path=/usr/share/checkbox/scripts:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games /tmp/checkboxQbuE6V/input /tmp/checkboxQbuE6V/output Jun 7 08:09:59 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 08:10:55 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:10:55 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:10:55 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:10:55 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:10:55 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 08:10:55 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 08:10:55 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/share/checkbox/backend --path=/usr/share/checkbox/scripts:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games /tmp/checkboxQbuE6V/input /tmp/checkboxQbuE6V/output Jun 7 08:10:55 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 08:17:01 Isis CRON[4215]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 08:17:01 Isis CRON[4215]: pam_unix(cron:session): session closed for user root Jun 7 09:17:02 Isis CRON[4766]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 09:17:02 Isis CRON[4766]: pam_unix(cron:session): session closed for user root Jun 7 10:17:02 Isis CRON[5046]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 10:17:02 Isis CRON[5046]: pam_unix(cron:session): session closed for user root Jun 7 11:17:02 Isis CRON[5325]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 11:17:02 Isis CRON[5325]: pam_unix(cron:session): session closed for user root Jun 7 12:17:01 Isis CRON[5617]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 12:17:01 Isis CRON[5617]: pam_unix(cron:session): session closed for user root Jun 7 13:07:51 Isis gnome-screensaver-dialog: pam_unix(gnome-screensaver:auth): authentication failure; logname= uid=1000 euid=1000 tty=:0.0 ruser= rhost= user=mimi Jun 7 13:07:51 Isis gnome-screensaver-dialog: pam_winbind(gnome-screensaver:auth): getting password (0x00000388) Jun 7 13:07:51 Isis gnome-screensaver-dialog: pam_winbind(gnome-screensaver:auth): pam_get_item returned a password Jun 7 13:07:51 Isis gnome-screensaver-dialog: pam_winbind(gnome-screensaver:auth): request wbcLogonUser failed: WBC_ERR_AUTH_ERROR, PAM error: PAM_USER_UNKNOWN (10), NTSTATUS: NT_STATUS_NO_SUCH_USER, Error message was: No such user Jun 7 13:08:03 Isis gnome-screensaver-dialog: pam_unix(gnome-screensaver:auth): conversation failed Jun 7 13:08:03 Isis gnome-screensaver-dialog: pam_unix(gnome-screensaver:auth): auth could not identify password for [mimi] Jun 7 13:08:03 Isis gnome-screensaver-dialog: pam_winbind(gnome-screensaver:auth): getting password (0x00000388) Jun 7 13:08:08 Isis lightdm: pam_unix(lightdm:session): session opened for user lightdm by (uid=0) Jun 7 13:08:08 Isis lightdm: pam_ck_connector(lightdm:session): nox11 mode, ignoring PAM_TTY :1 Jun 7 13:08:13 Isis lightdm: pam_succeed_if(lightdm:auth): 
requirement "user ingroup nopasswdlogin" not met by user "mimi" Jun 7 13:08:16 Isis dbus[660]: [system] Rejected send message, 2 matched rules; type="method_call", sender=":1.91" (uid=104 pid=5961 comm="/usr/lib/indicator-datetime/indicator-datetime-ser") interface="org.freedesktop.DBus.Properties" member="GetAll" error name="(unset)" requested_reply="0" destination=":1.17" (uid=0 pid=1333 comm="/usr/sbin/console-kit-daemon --no-daemon ") Jun 7 13:08:18 Isis dbus[660]: [system] Rejected send message, 2 matched rules; type="method_call", sender=":1.98" (uid=104 pid=5999 comm="/usr/lib/indicator-datetime/indicator-datetime-ser") interface="org.freedesktop.DBus.Properties" member="GetAll" error name="(unset)" requested_reply="0" destination=":1.17" (uid=0 pid=1333 comm="/usr/sbin/console-kit-daemon --no-daemon ") Jun 7 13:10:15 Isis lightdm: pam_unix(lightdm:session): session closed for user lightdm Jun 7 13:17:02 Isis CRON[6181]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 13:17:02 Isis CRON[6181]: pam_unix(cron:session): session closed for user root Jun 7 13:55:14 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 13:55:14 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 13:55:14 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 13:55:14 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 13:55:14 Isis sudo: pam_unix(sudo:auth): conversation failed Jun 7 13:55:14 Isis sudo: pam_unix(sudo:auth): auth could not identify password for [mimi] Jun 7 13:55:14 Isis sudo: mimi : 3 incorrect password attempts ; TTY=unknown ; PWD=/home/mimi ; USER=root ; COMMAND=/usr/bin/unetbootin 'rootcheck=no' Jun 7 13:55:14 Isis sudo: unable to execute /usr/sbin/sendmail: No such file or directory Jun 7 14:02:33 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.canonical.LandscapeClientSettings.configure for unix-process:6736:3087856 [/usr/bin/python /usr/bin/landscape-client-settings-ui] (owned by unix-user:mimi) Jun 7 14:02:51 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 FAILED to authenticate to gain authorization for action com.canonical.LandscapeClientSettings.configure for unix-process:6752:3089992 [/usr/bin/python /usr/bin/landscape-client-settings-ui] (owned by unix-user:mimi) Jun 7 14:03:14 Isis polkitd(authority=local): Operator of unix-session:/org/freedesktop/ConsoleKit/Session1 successfully authenticated as unix-user:mimi to gain TEMPORARY authorization for action com.canonical.LandscapeClientSettings.configure for unix-process:6763:3092515 [/usr/bin/python /usr/bin/landscape-client-settings-ui] (owned by unix-user:mimi) Jun 7 14:17:01 Isis CRON[6933]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 14:17:01 Isis CRON[6933]: pam_unix(cron:session): session closed for user root Jun 7 15:17:02 Isis CRON[7611]: pam_unix(cron:session): session opened for user root by (uid=0) Jun 7 15:17:02 Isis CRON[7611]: pam_unix(cron:session): session closed for user root


  • 64-bit Archives Needed

    - by user9154181
    A little over a year ago, we received a question from someone who was trying to build software on Solaris. He was getting errors from the ar command when creating an archive. At that time, the ar command on Solaris was a 32-bit command. There was more than 2GB of data, and the ar command was hitting the file size limit for a 32-bit process that doesn't use the largefile APIs. Even in 2011, 2GB is a very large amount of code, so we had not heard this one before. Most of our toolchain was extended to handle 64-bit sized data back in the 1990's, but archives were not changed, presumably because there was no perceived need for it. Since then, of course, programs have continued to get larger, and in 2010 the time had finally come to investigate the issue and find a way to provide for larger archives. As part of that process, I had to do a deep dive into the archive format, and also do some Unix archeology. I'm going to record what I learned here, to document what Solaris does, and in the hope that it might help someone else trying to solve the same problem for their platform.

    Archive Format Details

    Archives are hardly cutting edge technology. They are still used of course, but their basic form hasn't changed in decades. Other than to fix a bug, which is rare, we don't tend to touch that code much. The archive file format is described in /usr/include/ar.h, and I won't repeat the details here. Instead, here is a rough overview of the archive file format as implemented by System V Release 4 (SVR4) Unix systems such as Solaris:

    - Every archive starts with a "magic number". This is a sequence of 8 characters: "!<arch>\n".
    - The magic number is followed by 1 or more members. A member starts with a fixed header, defined by the ar_hdr structure in /usr/include/ar.h. Immediately following the header comes the data for the member.
    - Members must be padded at the end with newline characters so that they have even length.

    The requirement to pad members to an even length is a dead giveaway as to the age of the archive format. It tells you that this format dates from the 1970's, and more specifically from the era of 16-bit systems such as the PDP-11 that Unix was originally developed on. A 32-bit system would have required 4 bytes, and 64-bit systems such as we use today would probably have required 8 bytes. Two-byte alignment is a poor choice for ELF object archive members: 32-bit objects require 4 byte alignment, and 64-bit objects require 8 byte alignment. The link-editor uses mmap() to process archives, and if the members have the wrong alignment, we have to slide (copy) them to the correct alignment before we can access the ELF data structures inside. The archive format requires 2 byte padding, but it doesn't prohibit more. The Solaris ar command takes advantage of this, and pads ELF object members to 8 byte boundaries. Anything else is padded to 2 as required by the format.

    The archive header (ar_hdr) represents all numeric values using an ASCII text representation rather than as binary integers. This means that an archive that contains only text members can be viewed using tools such as cat, more, or a text editor. The original designers of this format clearly thought that archives would be used for many file types, and not just for objects. Things didn't turn out that way, of course: nearly all archives contain relocatable objects for a single operating system and machine, and are used primarily as input to the link-editor (ld).
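    To make that layout concrete, here is a minimal sketch in C# (an arbitrary choice of language, purely for illustration) that walks the member headers of an SVR4 archive, using the field widths from the ar_hdr structure described above:

        using System;
        using System.IO;
        using System.Text;

        static class ArWalker
        {
            // List each member's name and size from an SVR4 archive.
            public static void List(string path)
            {
                using (FileStream f = File.OpenRead(path))
                {
                    byte[] magic = new byte[8];
                    if (f.Read(magic, 0, 8) != 8 ||
                        Encoding.ASCII.GetString(magic) != "!<arch>\n")
                        throw new InvalidDataException("not an archive");

                    byte[] hdr = new byte[60];   // sizeof(struct ar_hdr)
                    while (f.Read(hdr, 0, 60) == 60)
                    {
                        // ar_hdr: name[16] date[12] uid[6] gid[6] mode[8] size[10] fmag[2]
                        string name = Encoding.ASCII.GetString(hdr, 0, 16).TrimEnd();
                        // Numeric fields are ASCII decimal text, not binary integers
                        long size = long.Parse(Encoding.ASCII.GetString(hdr, 48, 10).Trim());
                        Console.WriteLine("{0,-16} {1}", name, size);

                        // Member data is padded with a newline to an even length
                        f.Seek(size + (size & 1), SeekOrigin.Current);
                    }
                }
            }
        }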
    Archives can have special members that are created by the ar command rather than being supplied by the user. These special members are all distinguished by having a name that starts with the slash (/) character. This is an unambiguous marker that says the user could not have supplied it: regular archive members are given the plain name of the file that was inserted to create them, with any path components stripped off, and slash is the delimiter character used by Unix to separate path components, so it cannot occur within a plain file name. The ar command hides the special members from you when you list the contents of an archive, so most users don't know that they exist. There are only two possible special members: a symbol table that maps ELF symbols to the object archive member that provides them, and a string table used to hold member names that exceed 15 characters. The '/' convention for tagging special members provides room for adding more such members should the need arise. As I will discuss below, we took advantage of this fact to add an alternate 64-bit symbol table special member which is used in archives that are larger than 4GB.

    When an archive contains ELF object members, the ar command builds a special archive member known as the symbol table, which maps every ELF symbol in those objects to the archive member that provides it. The link-editor uses this symbol table to determine which symbols are provided by the objects in that archive. If an archive has a symbol table, it will always be the first member in the archive, immediately following the magic number. Unlike member headers, symbol tables do use binary integers to represent offsets. These integers are always stored in big-endian format, even on a little-endian host such as x86.

    The archive header (ar_hdr) provides 15 characters for representing the member name. If any member has a name that is longer than this, then the real name is written into a special archive member called the string table, and the member's name field instead contains a slash (/) character followed by a decimal representation of the offset of the real name within the string table. The string table is required to precede all normal archive members, so it will be the second member if the archive contains a symbol table, and the first member otherwise.

    The archive format is not designed to make finding a given member easy. Such operations move through the archive from front to back, examining each member in turn, and run in O(n) time. This would be bad if archives were commonly used in that manner, but in general they are not. Typically, the ar command is used to build a new archive from scratch, inserting all the objects in one operation, and then the link-editor accesses the members in the archive in constant time by using the offsets provided by the symbol table. Both of these operations are reasonably efficient. However, listing the contents of a large archive with the ar command can be rather slow.
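    Coming back to the long-name scheme for a moment, here is a minimal sketch of resolving a member name against the string table (C# again, purely for illustration; the trailing '/' on short names shown here is the common SVR4 convention):

        using System.Text;

        static class ArNames
        {
            // Resolve a member name from the 16-character ar_hdr name field.
            // Long names live in the string table; the header holds "/offset".
            public static string Resolve(string nameField, byte[] stringTable)
            {
                string name = nameField.TrimEnd();
                if (name.Length > 1 && name[0] == '/' && char.IsDigit(name[1]))
                {
                    int offset = int.Parse(name.Substring(1));
                    int end = offset;
                    // String table entries are delimited by a '/' character
                    while (stringTable[end] != (byte)'/')
                        end++;
                    return Encoding.ASCII.GetString(stringTable, offset, end - offset);
                }
                // Short names are stored directly, conventionally '/'-terminated
                return name.TrimEnd('/');
            }
        }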
    Factors That Limit Solaris Archive Size

    As is often the case, there was more than one limiting factor preventing Solaris archives from growing beyond the 32-bit limits of 2GB (32-bit signed) and 4GB (32-bit unsigned). These limits are listed in the order they are hit as archive size grows, so the earlier ones mask those that follow.

    - The original Solaris archive file format can handle sizes up to 4GB without issue. However, the ar command was delivered as a 32-bit executable that did not use the largefile APIs. As such, the ar command itself could not create a file larger than 2GB. One could solve this by building ar with the largefile APIs, which would allow it to reach 4GB, but a simpler and better answer is to deliver a 64-bit ar, which has the ability to scale well past 4GB.
    - Symbol table offsets are stored as 32-bit big-endian binary integers, which limits the maximum archive size to 4GB (see the sketch after this list). Getting around this limit requires a different symbol table format, or an extension mechanism to the current one, similar in nature to the way member names longer than 15 characters are handled in member headers.
    - The size field in the archive member header (ar_hdr) is an ASCII string capable of representing a 32-bit unsigned value. This places a 4GB limit on the size of any individual member in an archive.

    In considering format extensions to get past these limits, it is important to remember that very few archives will require the ability to scale past 4GB for many years. The old format, while no beauty, continues to be sufficient for its purpose. This argues for a backward-compatible fix that allows newer versions of Solaris to produce archives that are compatible with older versions of the system unless the size of the archive exceeds 4GB.
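    As promised above, here is a minimal sketch of reading the classic SVR4 symbol table member (C# for illustration; the layout is the standard SVR4 one: a big-endian count, then that many big-endian 32-bit member-header offsets, then the NUL-terminated symbol names):

        using System;
        using System.Text;

        static class ArSymTab
        {
            // Dump the classic 32-bit SVR4 archive symbol table.
            public static void Dump(byte[] data)
            {
                int pos = 0;
                uint count = ReadBE32(data, ref pos);

                uint[] offsets = new uint[count];
                for (int i = 0; i < count; i++)
                    offsets[i] = ReadBE32(data, ref pos);   // 32 bits: hence the 4GB ceiling

                for (int i = 0; i < count; i++)
                {
                    int start = pos;
                    while (data[pos] != 0)
                        pos++;
                    string symbol = Encoding.ASCII.GetString(data, start, pos - start);
                    pos++;   // skip the terminating NUL
                    Console.WriteLine("{0} -> member header at offset {1}", symbol, offsets[i]);
                }
            }

            // Symbol table integers are big-endian even on little-endian hosts
            static uint ReadBE32(byte[] b, ref int pos)
            {
                uint v = ((uint)b[pos] << 24) | ((uint)b[pos + 1] << 16) |
                         ((uint)b[pos + 2] << 8) | b[pos + 3];
                pos += 4;
                return v;
            }
        }

    The 64-bit variant described below keeps this shape and, per the widening approach the article describes, simply widens the count and offsets to 8 bytes.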
    Archive Format Differences Among Unix Variants

    While considering how to extend Solaris archives to scale to 64 bits, I wanted to know how similar archives from other Unix systems are to those produced by Solaris, and whether they had already solved the 64-bit issue. I've successfully moved archives between different Unix systems before with good luck, so I knew that there was some commonality. If it turned out that there was already a viable de facto standard for 64-bit archives, it would obviously be better to adopt that rather than invent something new.

    The archive file format is not formally standardized. However, the ar command and archive format were part of the original Unix from Bell Labs. Other systems started with that format, extending it in various, often incompatible, ways, but usually with the same common shared core. Most of these systems use the same magic number to identify their archives, despite the fact that their archives are not always fully compatible with each other. It is often true that archives can be copied between different Unix variants, and if the member names are short enough, the ar command from one system can often read archives produced on another.

    In practice, it is rare to find an archive containing anything other than objects for a single operating system and machine type. Such an archive is only of use on the type of system that created it, and is only used on that system. This is probably why cross-platform compatibility of archives between Unix variants has never been an issue. Otherwise, the use of the same magic number in archives with incompatible formats would be a problem.

    I was able to find information for a number of Unix variants, described below. These can be divided roughly into three tribes: SVR4 Unix, BSD Unix, and IBM AIX. Solaris is an SVR4 Unix, and its archives are completely compatible with those from the other members of that group (GNU/Linux, HP-UX, and SGI IRIX).

    AIX

    AIX is an exception to the rule that Unix archive formats are all based on the original Bell Labs Unix format. It appears that AIX supports 2 formats (small and big), both of which differ in fundamental ways from other Unix systems:

    - These formats use a different magic number than the standard one used by Solaris and other Unix variants.
    - They include support for removing archive members from a file without reallocating the file, marking dead areas as unused, and reusing them when new archive items are inserted.
    - They have a special table of contents member (File Member Header) which lets you find out everything that's in the archive without having to actually traverse the entire file. Their symbol table members are quite similar to those from other systems, though.
    - Their member headers are doubly linked, containing offsets to both the previous and next members.

    Of the Unix systems described here, AIX has the only format I saw that will have reasonable insert/delete performance for really large archives. Everyone else has O(n) performance, and is going to be slow to use with large archives.

    BSD

    BSD has gone through 4 versions of its archive format, which are described in its manpage. It uses the same member header as SVR4, but its symbol table format is different, and its scheme for long member names puts the name directly after the member header rather than into a string table.

    GNU/Linux

    The GNU toolchain uses the SVR4 format, and is compatible with Solaris.

    HP-UX

    HP-UX seems to follow the SVR4 model, and is compatible with Solaris.

    IRIX

    IRIX has 32 and 64-bit archives. The 32-bit format is the standard SVR4 format, and is compatible with Solaris. The 64-bit format is the same, except that the symbol table uses 64-bit integers. IRIX assumes that an archive contains objects of a single ELFCLASS/MACHINE, and any archive containing ELFCLASS64 objects receives a 64-bit symbol table. Although they only use it for 64-bit objects, nothing in the archive format limits it to ELFCLASS64. It would be perfectly valid to produce a 64-bit symbol table in an archive containing 32-bit objects, text files, or anything else.

    Tru64 Unix (Digital/Compaq/HP)

    Tru64 Unix uses a format much like ours, but its symbol table is a hash table, making specific symbol lookup much faster. The Solaris link-editor uses archives by examining the entire symbol table looking for unsatisfied symbols for the link, not by looking up individual symbols, so there would be no benefit to Solaris from such a hash table. The Tru64 ld must use a different approach, in which the hash table pays off for them.

    Widening the existing SVR4 archive symbol tables rather than inventing something new is the simplest path forward. There is ample precedent for this approach in the ELF world. When ELF was extended to support 64-bit objects, the approach was largely to take the existing data structures and define 64-bit versions of them. We called the old set ELF32, and the new set ELF64. My guess is that there was no need to widen the archive format at that time, but had there been, it seems obvious that this is how it would have been done.

    The Implementation of 64-bit Solaris Archives

    As mentioned earlier, there was no desire to improve the fundamental nature of archives. They have always had O(n) insert/delete behavior, and for the most part it hasn't mattered. AIX made efforts to improve this, but those efforts did not find widespread adoption. For the purposes of link-editing, which is essentially the only thing archives are used for, the existing format is adequate, and issues of backward compatibility trump the desire to do something technically better. Widening the existing symbol table format to 64 bits is therefore the obvious way to proceed. For Solaris 11, I implemented that, and I also updated the ar command so that a 64-bit version is run by default.
    This eliminates the two most significant limits to archive size, leaving only the limit on an individual archive member. We only generate a 64-bit symbol table if the archive exceeds 4GB, or when the new -S option to the ar command is used. This maximizes backward compatibility, as an archive produced by Solaris 11 is highly likely to be less than 4GB in size, and will therefore employ the same format understood by older versions of the system. The main reason for the existence of the -S option is to allow us to test the 64-bit format without having to construct huge archives to do so. I don't believe it will find much use outside of that.

    Other than the new ability to create and use extremely large archives, this change is largely invisible to the end user. When reading an archive, the ar command will transparently accept either form of symbol table. Similarly, the ELF library (libelf) has been updated to understand either format. Users of libelf (such as the link-editor ld) do not need to be modified to use the new format, because these changes are encapsulated behind the existing functions provided by libelf.

    As mentioned above, this work did not lift the limit on the maximum size of an individual archive member. That limit remains fixed at 4GB for now. This is not because we think objects will never get that large, for the history of computing says otherwise. Rather, it is based on an estimate that single relocatable objects of that size will not appear for a decade or two. A lot can change in that time, and it is better not to overengineer things by writing code that will sit and rot for years without being used. It is not too soon, however, to have a plan for that eventuality. When the time comes for this limit to be lifted, I believe there is a simple solution that is consistent with the existing format. The archive member header size field is an ASCII string, like the name, and as such, the overflow scheme used for long names can also be used to handle the size: the size string would be placed into the archive string table, and its offset in the string table would then be written into the archive header size field using the same "/ddd" format used for overflowed names.


  • CodePlex Daily Summary for Thursday, January 20, 2011

    Popular Releases

    - AutoLoL: AutoLoL v1.5.4: Added champion: Renekton. Removed automatic file association. Fix: the recent files combobox didn't always open a file when an item was selected. Fix: removing a recently opened file caused an error.
    - RazorEngine: RazorEngine v2.0: IMPORTANT: RazorEngine v2 is a complete rewrite of the templating framework. Because of this, there are no guarantees that code written around v1.2 will run without modification. The v2 framework is cleaner and refined. Please check the forthcoming CodePlex site updates, which will detail the new functionality. Features in v2: ASP.NET Medium Trust support; custom template activation with dependency injection support; subtemplating using the new @Include method. Known issues: @Include doe...
    - AutoSPInstaller: AutoSPInstaller for SharePoint 2010 (v2 Beta): New package consisting of AutoSPInstaller v2 files. If you have not previously downloaded AutoSPInstaller, or are interested in the new functionality and format, this is the recommended release.
    - DotNetNuke® Community Edition: 05.06.01: Major highlights: fixed issue to remove preCondition checks when upgrading to .NET 4.0; fixed issue where some valid domains were failing email validation checks; fixed issue where editing Host menu page settings assigns the page to a portal; fixed issue which caused XHTML validation problems in 5.6.0; fixed issue where an aspx page in any subfolder was inaccessible; fixed issue where the Config.Touch method signature had an unintentional breaking change in 5.6.0; fixed issue which caused...
    - MiniTwitter: 1.65: [Japanese release notes, garbled in the source; the legible fragments mention List support, in-reply-to handling, and OAuth.]
    - Opalis Community Releases: Integration Pack for Windows Tasks: NOTE: There are two objects in this Integration Pack, each of which has multiple uses. Each object comes with its own documentation. The Integration Pack Object - File System Maintenance: this object extends the current Foundation Object functionality within OIS. Specifically, it extends the functionality of the Delete File, Delete Folder, and Get File Status objects. It also works well with many of the Foundation Objects which can process the inf...
    - ASP.net Ribbon: Version 2.1: Tadaaa... Version 2.1 brings a lot of things; have a look at the homepage to see what's new. I also wanted to (really) improve the Designer and add great things, but it took too much time, and as some of you were waiting for fixes, I decided just to fix bugs and add some features. Have a look at the demo app to see the new features. Thanks! (You can expect further releases if bugs are not fixed correctly: 2.2, 2.3, 2.4...)
    - iTracker Asp.Net Starter Kit: Version 3.0.0: This is the initial release of the version 3.0.0 Visual Studio 2010 (.NET 4.0) remake of the ITracker application. I consider this a working, stable application, but since there are still some features missing to make it "complete", I'm leaving it listed as a "beta" release. I am hoping to make it feature complete for v3.1.0, but anything is possible.
    - mytrip.mvc (CMS & e-Commerce): mytrip.mvc 1.0.52.1 beta 2: New MVC3 RTM. Bug fixes: dropdown select; Add Store/Department and Add Store/Produser. WEB.mytrip.mvc 1.0.52.1 (web files for installing on hosting); system requirements: .NET 4.0, MSSQL 2008 or MySQL (tables are created in the database automatically; with .\SQLEXPRESS the database is created automatically in the App_Data folder). SRC.mytrip.mvc 1.0.52.1; system requirements: Visual Studio 2010 or Web Developer 2010, MSSQL 2008 or MySQL (tables are created in the database automatically; with .\SQLEXPRESS the database is created automatically in the App_Data folder). Connector/Net...
    - ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.6.1: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager. Changes: the RenderView controller extension now works for Razor as well; the live demo switched to Razor.
    - BloodSim: BloodSim - 1.3.3.1: Priority update to resolve a bug that was causing Boss damage to ignore Blood Shields entirely.
    - Rawr: Rawr 4.0.16 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.php. This is the Cataclysm beta release. More details can be found at the following link: http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262. As of this release, you can now also begin using the new downloadable WPF version of Rawr! This is a pre-alpha release of the WPF version; there are likely to be a lot of issues. If you have a problem, please follow the Posting Guidelines and put it into the Issue Tracker. W...
    - MvcContrib: an Outer Curve Foundation project: MVC 3 - 3.0.51.0: Please see the Change Log for a complete list of changes. MVC BootCamp. Description of the releases: MvcContrib.Release.zip (MvcContrib.dll, MvcContrib.TestHelper.dll); MvcContrib.Extras.Release.zip (T4MVC, plus the extra view engines / controller factories and other functionality in the project; this file includes the main MvcContrib assembly). Samples are included in the release. You do not need MvcContrib if you download the Extras.
    - N2 CMS: 2.1.1: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. 2.1.1 is a maintenance release. 2.1 major changes: support for auto-implemented properties ({get;set;}, based on a contribution by And Poulsen); file manager improvements (multiple file upload, resize images to fit); new image gallery; infinite scroll paging on news; content templates. First time with N2? Try the demo site, download one of the template packs (above) and open...
    - DNN Menu Provider: 01.00.00 (DNN 5.X and newer): Our first release build is a stable release. It requires at least DotNetNuke 5.1.0; previous DNN versions are not supported. Major highlights: an easy-to-use template system built on a self-made and powerful engine; ready for multilingual websites; a flexible translation integration based on the ASP.NET provider model, as well as a built-in provider for DNN Localization (DNN 5.5.X or newer required) and a provider for the EALO Translation API; advanced in-memory caching function...
    - TweetSharp: TweetSharp v2.0.0.0 - Preview 8: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: this code is currently preview quality. Preview 8 changes: upgraded library references to address reported regressions. Third-party library versions: Hammock v1.1.6 (http://hammock.codeplex.com); Json.NET 4.0 Release 1 (http://json.codeplex.com).
    - VidCoder: 0.8.1: Adds the ability to choose an arbitrary range (in seconds or frames) to encode. Adds the ability to override the title number in the output file name when enqueuing multiple titles. Updated presets: added iPhone 4 and Apple TV 2; fixed some existing presets that should have had weightp=0 or trellis=0 on them. Added {parent} option to the auto-name format; use {parent:2} to refer to a folder 2 levels above the input file. Added {title:2} option to the auto-name format; adds leading zeroes to reach the sp...
    - Microsoft SQL Server Product Samples: Database: AdventureWorks2008R2 without filestream: This download contains a version of the AdventureWorks2008R2 OLTP database without FILESTREAM properties. You do not need to have filestream enabled to attach this database. No additional schema or data changes have been made. To install the version of AdventureWorks2008R2 that includes filestream, use the SR1 installer available here. Prerequisites: Microsoft SQL Server 2008 R2 must be installed; Full-Text Search must be enabled. Installing the AdventureWorks2008R2 OLTP database: 1. Cl...
    - ASP.NET: ASP.NET MVC 3 Application Upgrader: This standalone application upgrades ASP.NET MVC 2 applications to ASP.NET MVC 3. It works for both ASP.NET MVC 3 RC 2 and RTM. The tool only supports Visual Studio 2010 solutions and MVC 2 projects targeting .NET 4. It will not work with VS 2008 solutions, MVC 1 projects, or projects targeting .NET 3.5; those projects will first have to be upgraded using Visual Studio 2010 and/or retargeted for .NET 4. The tool will: create a backup of your entire solution; update all web applications and ...
    - NuGet: NuGet 1.0 RTM: NuGet is a free, open source, developer-focused package management system for the .NET platform, intent on simplifying the process of incorporating third-party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the Package Manager Console and the Add Package Dialog. The URL to the package OData feed is: http://go.microsoft.com/fwlink/?LinkID=206669

    New Projects

    - AnyDbTest: AnyDbTest is the first and only automated DB unit testing tool using XML as the test case format. AnyDbTest is designed for DBAs and DB developers. It supports all kinds of popular databases, such as Oracle, SQL Server, MySQL, and PostgreSQL.
    - BizTalk Context Adder Pipeline Component: A context property adder pipeline component which utilizes XML configuration.
    - BOQ: Bill Of Quantity... WinForms project.
    - Cascading System: The cascading components of the CASA simulation system. This is to be used with the CASA System.
    - Condobook: A project for studying ASP.NET MVC by the SharpShooters group.
    - Crystal Report WCF Services: A mini WCF web service that allows a web server without Crystal Reports server installed to generate Crystal Reports via WCF.
    - Cultiv Photometadata for Umbraco: The PhotoMetaData package extracts metadata from images that you upload to your Umbraco media section.
    - Dots and Boxes: boxes
    - Easy Sound Miage: An M1 (first-year Master's) year project.
    - File Encoding Checker: File Encoding Checker is a GUI tool that allows you to validate the text encoding of one or more files. The tool can display the encoding for all selected files, or only the files that do not have the encodings you specify. File Encoding Checker requires .NET 2 or above to run.
    - Intuition Logic C++ Library: A C++ library of computer vision, machine learning, optimization, etc., based on design patterns that make it easy to integrate into your existing project.
    - LingDong: LingDong is an open-source search engine implemented in C# and ASP.NET. It includes Index, Rank and Query components, but not a Crawler.
    - LiveSpaceBackupTool: This tool helps you back up your Windows Live Spaces blogs and comments before they are forever gone. The options offered by Microsoft are to download a local copy or migrate to wordpress.com, but unfortunately some details will be lost if you do that. Use this tool to avoid it.
    - My IE Accelerators: A dict.cn IE accelerator.
    - Orchard Anti Spam Module: The Orchard Anti Spam Module is an API used by other Orchard modules to prevent users from entering messages categorized as spam in forms. It uses online spam services like Akismet and TypePad.
    - Orchard Contact User Module: An Orchard module for users to be able to contact each other through messages.
    - PortfolioEngine.Net: PortfolioEngine.Net is an easy-to-use showcase engine to present the projects you worked on. It is developed in C#, using ASP.NET MVC 2 on top of .NET 3.5.
    - PoshWSUS: A module designed to fill the gap where a system administrator wants to perform WSUS commands against the server via the command line rather than the GUI.
    - Script Helper: Easily run large and continuously changing batches of scripts.
    - SetDefaultAudioEndpoint: SetDefaultAudioEndpoint (Set Default Audio Endpoint) allows users to set the default audio playback device using computer code. The solution opens the Sound Control Panel and uses the Microsoft UI Automation accessibility framework to make changes.
    - SharePoint 2010 Acknowledgement Page: A SharePoint 2010 acknowledgement solution. The solution deploys an acknowledgement page using an HTTP module to check the SharePoint 2010 user profile every time a user goes to the site, and presents users with an acknowledgement page they must accept prior to accessing the site.
    - SimpleML: SimpleML is an efficient non-extractive parser for an XML-like markup language. It supports navigation similar to SAX and DOM, the ability to add/delete elements, and full macro expansion similar to PHP using Lua. It is programmed in C++ and is fully Unicode compliant.
    - SLRTC#: To be continued.
    - StreamBase Community Extensions: StreamBase Community Extensions (SBCX) is aimed at providing a Microsoft Rx-based .NET class library, a code generator (like SQLMetal), and a widely useful set of Windows PowerShell cmdlets, providers, aliases, filters, functions, and scripts for StreamBase.
    - VS2010 Context Menu Fix: This utility fixes an annoying quirk of the context menu in the VS2010 document editor which impedes context menu navigation via keyboard.
    - WinEye: A combination of Spy++ and Hawkeye; it's meant to pull apart native and managed windows for further investigation.

    Read the article

  • CodePlex Daily Summary for Thursday, December 16, 2010

CodePlex Daily Summary for Thursday, December 16, 2010

Popular Releases

• SharpDropBox Client for .NET: WP7 SharpDropBox Client - 0.1 Technology Preview: I decided to go ahead and release this. It works well for simple browsing of the folder structure and downloading files (and login works). See the samples for an example of how to use it. I am in progress with a couple of other methods which aren't currently working.
• SQL Monitor: SQL Monitor 2.9: 1. Automatically sets the SQL for a new query if an object (table/SP/function/view) is selected.
• uComponents: uComponents v2.0.1: This release fixes a critical issue with IIS 6, removes various script error issues in Internet Explorer, and supports the media picker with advanced dialog and preview. uComponents is a collaborative project for creating components for Umbraco including data types, XSLT extensions, controls and more. Containing 20 data types, 7 XSLT extension libraries, keyboard shortcuts, drag-and-drop functionality, as well as great developer utilities, uComponents is one of the must-have packages for a...
• SplendidCRM: SplendidCRM 5.0 Community Edition: SplendidCRM Software has adopted the GNU Affero General Public License Version 3 (AGPLv3) for its Community Edition. This release includes the full set of SQL source code in the Community Edition, something that was previously only available in the Professional and Enterprise Editions. An article on the subject of commercial open-source licensing has been posted at http://www.codeproject.com/KB/architecture/splendid-guide-article6.aspx.
• DotSpatial: DotSpatial 12-15-2010: This release contains a few minor bug fixes, and hopefully the GDAL libraries for the 3.5 x86 build actually built to the correct directory this time.
• DotNetNuke® Community Edition: 05.06.01 Beta: This is the initial beta of DotNetNuke 5.6.1. See the DotNetNuke Roadmap for a full list of changes in this release.
• MSBuild Extension Pack: December 2010: Release blog post. The MSBuild Extension Pack December 2010 release provides a collection of over 380 MSBuild tasks. A high-level summary of what the tasks currently cover includes the following. System items: Active Directory, certificates, COM+, console, date and time, drives, environment variables, event logs, files and folders, FTP, GAC, network, performance counters, registry, services, sound. Code: assemblies, AsyncExec, CAB files, code signing, DynamicExecute, file detokenisation, GU...
• Access Control Service Samples and Documentation (Labs): Samples-R3: Contains the latest ACS samples (corresponding to the R3 release) that show how to integrate ACS with web services and ASP.NET websites (Web Forms and MVC), and how to interact with the ACS Management Service. The readmes for these samples are available here.
• TweetSharp: TweetSharp v2.0.0.0 - Preview 5: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: this code is currently preview quality. Preview 5 changes: maintenance release with user-reported fixes. Preview 4 changes: reintroduced fluent interface support via a satellite assembly; added entities support, entity segmentation, and ITweetable/ITweeter interfaces for client development; numerous fixes reported by preview users. Preview 3 changes: numerous ...
• EnhSim: EnhSim 2.2.2 ALPHA: 2.2.2 ALPHA adds in the changes for 4.03a at level 85. To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - The spirit ...
• Silverlight Contrib: Silverlight Contrib 2010.1.0: 2010.1.0 new features: compatibility release for Silverlight 4 and Visual Studio 2010.
• FlickrNet API Library: 3.1.4000: Newest release. Now contains a dedicated Windows Phone 7 DLL as well as all previous DLLs. Also now contains Windows Help file documentation as standard.
• mojoPortal: 2.3.5.8: See the release notes on mojoportal.com: http://www.mojoportal.com/mojoportal-2358-released.aspx Note that we have separate deployment packages for .NET 3.5 and .NET 4.0. The deployment package downloads on this page are pre-compiled and ready for production deployment; they contain no C# source code. To download the source code see the Source Code tab. I recommend getting the latest source code using TortoiseHG; you can get the source code corresponding to this release here.
• Microsoft All-In-One Code Framework: Visual Studio 2010 Code Samples 2010-12-13: Code samples for Visual Studio 2010.
• Wii Backup Fusion: Wii Backup Fusion 0.9 Beta: Aqua or brushed-metal style for Mac OS X; shows selection count beside ID; game list selection mode via settings; compare Files <-> WBFS game lists; verify game images/DVD/WBFS; WIT command line for log (via settings); ability to cancel the game loading process; progress info while loading games; localization for dates; UTF-8 support; shortcuts added; view game infos in browser; transfer infos for log; all transfer routines rewritten; extract image from image/WBFS; support...
• .NETTER Code Starter Pack: v1.0.beta: '.NETTER Code Starter Pack' contains a gallery of Visual Studio 2010 solutions leveraging the latest technologies and frameworks based on the Microsoft .NET Framework. Each Visual Studio solution included here is focused on providing a very simple starting point for cutting-edge development technologies and frameworks, using the well-known Northwind database (for database-driven scenarios). The current release of this project includes starter samples for the following technologies: ASP.NET Dynamic...
• NuGet (formerly NuPack): NuGet 1.0 Release Candidate: NuGet is a free, open source, developer-focused package management system for the .NET platform intent on simplifying the process of incorporating third-party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the Package Manager Console and the Add Package dialog. This new build targets the newer feed (http://go.microsoft.com/fwlink/?LinkID=206669) and package format. See http://nupack.codeplex.com/documentation?title=Nuspe...
• Free Silverlight & WPF Chart Control - Visifire: Visifire Silverlight, WPF Charts v3.6.5 Released: Today we are releasing the final version of Visifire, v3.6.5, with the following new feature: a new property, AutoFitToPlotArea, has been introduced in DataSeries. AutoFitToPlotArea brings bubbles inside the PlotArea in order to avoid clipping of bubbles in a bubble chart. You can visit the Visifire documentation to learn more: http://www.visifire.com/visifirechartsdocumentation.php This release also includes a few bug fixes: chart threw an exception while adding a new Axis in the chart using Vi...
• PHPExcel: PHPExcel 1.7.5 Production: Donations: donate via PayPal. If you want to, we can also add your name/company on our Donation Acknowledgements page. PEAR channel: we now also have a full PEAR channel! Here's how to use it. New installation: pear channel-discover pear.pearplex.net, then pear install pearplex/PHPExcel. Or if you've already installed PHPExcel before: pear upgrade pearplex/PHPExcel. The official page can be found at http://pearplex.net. Want to contribute? Please refer to the Contribute page.
• All-In-One Code Framework (Chinese edition): 2010-12-10: (announcement in Chinese, partially garbled in this copy) The December 2010 release is available at http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 and includes 12 new code samples covering ASP.NET, WinForm and Silverlight. More details: http://blog.csdn.net/sjb5201/archive/2010/12/13/6072675.aspx and the MSDN forum at http://social.msdn.microsoft.com/Forums/zh-CN/codezhchs/threads

New Projects

• Achievement Information System: AchIS is software which documents students' achievements.
• AutoMock: AutoMock saves you time when writing unit tests by wiring up all the dependencies as mocks for the class under test.
• Basic samples: Basic C# samples.
• BCrypt.Net: A .NET port of jBCrypt implemented in C#. It uses a variant of the Blowfish encryption algorithm's keying schedule, and introduces a work factor, which allows you to determine how expensive the hash function will be, allowing the algorithm to be "future-proof".
• Desafio Dot.Net: A project for the DotNet Challenge.
• EDAFramework: An Estimation of Distribution Algorithm framework.
• Fluxions: Fluxions is a 3D graphics engine designed to make writing simple graphics or visualization projects easy. It is ideal for writing computer games or scientific-type work where you need to easily prototype an idea. It includes support for writing OpenGL apps with GLUT or SDL.
• Guruzan.Com: Guruzan.Com is a new way to share your ideas and especially your claims. We are about to create a next-generation social network which will bring improved standards and innovations to internet sharing. Our project includes many students from different countries.
• HotCold: This game loads a board of squares, of which one is randomly selected as the "HOT" square. The goal is to find the "HOT" square in the fewest clicks possible.
• InSim.NET: A .NET InSim library for the racing simulator Live for Speed.
• Javascript Utils: JavaScript utilities.
• koppees: A small web interface (description in Estonian).
• KVB-Abfahrtstafel: The KVB departure board lets you view station boards in the Kölner Verkehrsbetriebe network, including disruption notices, on your PC before leaving the office, home, or hotel. A real relief, especially in lousy weather. (translated from German)
• Lessons in PivotViewer: The Silverlight PivotViewer is a great control with a great deal of potential. However, there are a lot of questions on how to use and customize the PivotViewer. This project will attempt to answer those questions by providing a series of lessons on the PivotViewer.
• Liuyi: liuyi
• Liuyi.network: Liuyi.network
• MiniDouBan: MiniDouBan
• N2F Yverdon Database Helper: A class to aid in performing simple database queries within N2F Yverdon. Also provides the capability to store queries for later use.
• N2F Yverdon Form Helper: A system that allows interaction with web forms in a fashion similar to that of ASP.NET MVC model binding.
• N2F Yverdon Sanitizers: A system to help ease the pain of sanitizing data. The system comes with some basic sanitizers as well as the framework necessary to enable easy development of custom sanitizers.
• Next Question: By Relevance: Next Question makes it easier for system users, administrators, business analysts, etc. to run ad-hoc queries on any data store. Regardless of the type or location of the information, Next Question removes the need for T-SQL programmers to write explicit queries to answer business
• NPicConvertDemo: Azure test/demo/example project.
• OpenSprints: OpenSprints for Windows.
• PML_FahrradVerleih: PML bicycle rental; a PMS project. (German)
• PowerLib: PowerLib extends .NET Base Class Library functionality with algorithms and data structures. The first release is planned for next week.
• ProJect Manager Software: Project manager software.
• qxtplatform: qxtplatform.
• Sencha Direct Stack based on ASP.NET MVC: This project focuses on creating an ExtJS/Sencha Direct server stack using ASP.NET MVC.
• SqlExecuter: Simply run SQL scripts against a SQL Server database.
• System8: Experimental Silverlight project to help understand System 8's behavior (a method of calibrating bottom-quark tagging).
• TeamManager: Team management software based on a Silverlight application. It uses PRISM and MEF for a flexible and extensible architecture.
• Umbraco Single Item Picker: An extension for Umbraco that picks content pages, media items, file system objects, and member items in a "better way"; additionally, the control returns not only the item name (title) but also extended information (path, icon).
• WebSharper Community Samples: WebSharper Community Samples.
• Windows Phone 7 Charts: Charting component for Windows Phone 7 applications.
• Windows Phone 7 Silverlight ZXing Barcode Scanning Library: ZXing (pronounced "zebra crossing") is an open-source, multi-format 1D/2D barcode image processing library originally implemented by Google in Java. This is a Silverlight port of the C# ZXing port created by Suraj Supekar at revision 1202 in the SVN repository.
• WPF RCON: WPF RCON is an RCON client for COD4 servers. It's an RCON library plus a WPF GUI, developed in C#. It's far from being complete.

    Read the article

  • MINSCN and Cache Fusion Read Consistency

    - by Liu Maclean
A question recently came up on the Ask Maclean forum about Past Image (PI) blocks and consistent-read (CR) copies in RAC. Since this is easier to show than to describe, the following experiment was run on an 11.2.0.3 two-node RAC:

SQL> select * from v$version;

BANNER
--------------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for Linux: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production

SQL> select * from global_name;

GLOBAL_NAME
--------------------------------------------------------------------------------
www.oracledatabase12g.com

SQL> drop table test purge;
Table dropped.

SQL> alter system flush buffer_cache;
System altered.

SQL> create table test(id number);
insert into test values(1);
insert into test values(2);
commit;

/* Use DBMS_ROWID to confirm that both rows of TEST live in the same block */
select dbms_rowid.rowid_block_number(rowid),dbms_rowid.rowid_relative_fno(rowid) from test;

DBMS_ROWID.ROWID_BLOCK_NUMBER(ROWID) DBMS_ROWID.ROWID_RELATIVE_FNO(ROWID)
------------------------------------ ------------------------------------
                               89233                                    1
                               89233                                    1

SQL> alter system flush buffer_cache;
System altered.

Instance 1, Session A issues an UPDATE:

SQL> update test set id=id+1 where id=1;
1 row updated.

Instance 1, Session B queries the buffer headers (X$BH) to see the state of the buffers holding that block:

SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

     STATE CR_SCN_BAS
---------- ----------
         1          0
         3    1227595

The STATE column of X$BH encodes the buffer state; the defined values are:

STATE NUMBER
KCBBHFREE      0  buffer free
KCBBHEXLCUR    1  buffer current (and if DFS locked X)
KCBBHSHRCUR    2  buffer current (and if DFS locked S)
KCBBHCR        3  buffer consistent read
KCBBHREADING   4  being read
KCBBHMRECOVERY 5  media recovery (current & special)
KCBBHIRECOVERY 6  instance recovery (somewhat special)

The states we will meet most often are state=1 (XCUR, exclusive current), state=2 (SCUR, shared current) and state=3 (CR, consistent read).

Now have Instance 2 update the same block. This triggers a 'gc current block 2-way' transfer: the current block is shipped to Instance 2, while the former current copy on Instance 1 is converted into a Past Image (PI):

Instance 2, Session C:

SQL> update test set id=id+1 where id=2;
1 row updated.

Instance 2, Session D:

SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

STATE CR_SCN_BAS
---------- ----------
1 0
3 1227641
3 1227638

The STATE=1 XCUR buffer now lives on Instance 2. Check what is left on Instance 1 after the GC transfer:

Instance 1, Session B:

SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

     STATE CR_SCN_BAS
---------- ----------
         3    1227641
         3    1227638
         8          0
         3    1227595

Instance 1 now holds three CR copies (state=3) plus the state=8 Past Image. Next, back in Instance 1's Session A (the session that ran the original UPDATE), run a SELECT against TEST with events 10046 and 10708 plus trace[rac.*] enabled:

SQL> alter session set events '10046 trace name context forever,level 8:10708 trace name context forever,level 103: trace[rac.*] disk high';
Session altered.

SQL> select * from test;

        ID
----------
         2
         2

select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

      STATE CR_SCN_BAS
---------- ----------
         3    1227716
         3    1227713
         8          0

Although V$BH/X$BH already showed CR copies of the block before the SELECT, the query did not reuse them. Comparing the X$BH output before and after: the old CR copies at SCN versions 1227641, 1227638 and 1227595 have been replaced by two new CR copies, at SCN versions 1227716 and 1227713, built for the SELECT. Why did Oracle build fresh CR versions instead of reusing the ones already in the local buffer cache? The 10708/rac.* trace of the SELECT tells the story:

PARSING IN CURSOR #140444679938584 len=337 dep=1 uid=0 oct=3 lid=0 tim=1335698913632292 hv=3345277572 ad='bc0e68c8' sqlid='baj7tjm3q9sn4'
SELECT /* OPT_DYN_SAMP */ /*+ ALL_ROWS IGNORE_WHERE_CLAUSE NO_PARALLEL(SAMPLESUB) opt_param('parallel_execution_enabled', 'false') NO_PARALLEL_INDEX(SAMPLESUB) NO_SQL_TUNE */ NVL(SUM(C1),0), NVL(SUM(C2),0) FROM (SELECT /*+ NO_PARALLEL("TEST") FULL("TEST") NO_PARALLEL_INDEX("TEST") */ 1 AS C1, 1 AS C2 FROM "SYS"."TEST" "TEST") SAMPLESUB
END OF STMT
PARSE #140444679938584:c=1000,e=27630,p=0,cr=0,cu=0,mis=1,r=0,dep=1,og=1,plh=1950795681,tim=1335698913632252
EXEC #140444679938584:c=0,e=44,p=0,cr=0,cu=0,mis=0,r=0,dep=1,og=1,plh=1950795681,tim=1335698913632390
*** 2012-04-29 07:28:33.632
kclscrs: req=0 block=1/89233
*** 2012-04-29 07:28:33.632
kclscrs: bid=1:3:1:0:7:80:1:0:4:0:0:0:1:2:4:1:26:0:0:0:70:1a:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:4:3:2:1:2:0:3f:0:1c:86:2d:4:0:0:0:0:a2:3c:7c:b:70:1a:0:0:0:0:1:0:7a:f8:76:1d:1:2:dc:5:a9:fe:17:75:0:0:0:0:0:0:0:0:0:0:0:0:63:e5:0:0:0:0:0:0:10:0:0:0
2012-04-29 07:28:33.632578 : kjbcrc[0x15c91.1 76896.0][9]
2012-04-29 07:28:33.632616 : GSIPC:GMBQ: buff 0xba1e8f90, queue 0xbb79f278, pool 0x60013fa0, freeq 1, nxt 0xbb79f278, prv 0xbb79f278
2012-04-29 07:28:33.632634 : kjbmscrc(0x15c91.1)seq 0x2 reqid=0x1c(shadow 0xb4bb4458,reqid x1c)mas@2(infosz 200)(direct 1)
2012-04-29 07:28:33.632654 : kjbsentscn[0x0.12bbc1][to 2]
2012-04-29 07:28:33.632669 : GSIPC:SENDM: send msg 0xba1e9000 dest x20001 seq 24026 type 32 tkts xff0000 mlen x17001a0
2012-04-29 07:28:33.633385 : GSIPC:KSXPCB: msg 0xba1e9000 status 30, type 32, dest 2, rcvr 1
*** 2012-04-29 07:28:33.633
kclwcrs: wait=0 tm=689
*** 2012-04-29 07:28:33.633
kclwcrs: got 1 blocks from ksxprcv
WAIT #140444679938584: nam='gc cr block 2-way' ela= 689 p1=1 p2=89233 p3=1 obj#=76896 tim=1335698913633418
2012-04-29 07:28:33.633490 : kjbcrcomplete[0x15c91.1 76896.0][0]
2012-04-29 07:28:33.633510 : kjbrcvdscn[0x0.12bbc1][from 2][idx
2012-04-29 07:28:33.633527 : kjbrcvdscn[no bscn <= rscn 0x0.12bbc1][from 2]
*** 2012-04-29 07:28:33.633
kclwcrs: req=0 typ=cr(2) wtyp=2hop tm=689

The first CR request in the trace was actually driven by a recursive dynamic-sampling query against TEST (the table has no statistics), and it waited on 'gc cr block 2-way'. Note the line kjbsentscn[0x0.12bbc1][to 2]: 0x12bbc1 = 1227713, i.e. Instance 1 sent Instance 2 a CR request for DBA 0x15c91.1 (obj# 76896) at exactly the SCN version we later saw in X$BH. The line kjbrcvdscn[no bscn <= rscn 0x0.12bbc1][from 2] shows that the received SCN version 0x12bbc1 was accepted as the best version the CR server architecture could supply. The trace then continues with the SELECT we actually typed:

PARSING IN CURSOR #140444682869592 len=18 dep=0 uid=0 oct=3 lid=0 tim=1335698913635874 hv=1689401402 ad='b1a188f0' sqlid='c99yw1xkb4f1u'
select * from test
END OF STMT
PARSE #140444682869592:c=4999,e=34017,p=0,cr=7,cu=0,mis=1,r=0,dep=0,og=1,plh=1357081020,tim=1335698913635870
EXEC #140444682869592:c=0,e=23,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,plh=1357081020,tim=1335698913635939
WAIT #140444682869592: nam='SQL*Net message to client' ela= 7 driver id=1650815232 #bytes=1 p3=0 obj#=0 tim=1335698913636071
*** 2012-04-29 07:28:33.636
kclscrs: req=0 block=1/89233
*** 2012-04-29 07:28:33.636
kclscrs: bid=1:3:1:0:7:83:1:0:4:0:0:0:1:2:4:1:26:0:0:0:70:1a:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:4:3:2:1:2:0:2:0:1c:86:2d:4:0:0:0:0:a2:3c:7c:b:70:1a:0:0:0:0:1:0:7d:f8:76:1d:1:2:dc:5:a9:fe:17:75:0:0:0:0:0:0:0:0:0:0:0:0:63:e5:0:0:0:0:0:0:10:0:0:0
2012-04-29 07:28:33.636209 : kjbcrc[0x15c91.1 76896.0][9]
2012-04-29 07:28:33.636228 : GSIPC:GMBQ: buff 0xba0e5d50, queue 0xbb79f278, pool 0x60013fa0, freeq 1, nxt 0xbb79f278, prv 0xbb79f278
2012-04-29 07:28:33.636244 : kjbmscrc(0x15c91.1)seq 0x3 reqid=0x1d(shadow 0xb4bb4458,reqid x1d)mas@2(infosz 200)(direct 1)
2012-04-29 07:28:33.636252 : kjbsentscn[0x0.12bbc4][to 2]
2012-04-29 07:28:33.636358 : GSIPC:SENDM: send msg 0xba0e5dc0 dest x20001 seq 24029 type 32 tkts xff0000 mlen x17001a0
2012-04-29 07:28:33.636861 : GSIPC:KSXPCB: msg 0xba0e5dc0 status 30, type 32, dest 2, rcvr 1
*** 2012-04-29 07:28:33.637
kclwcrs: wait=0 tm=865
*** 2012-04-29 07:28:33.637
kclwcrs: got 1 blocks from ksxprcv
WAIT #140444682869592: nam='gc cr block 2-way' ela= 865 p1=1 p2=89233 p3=1 obj#=76896 tim=1335698913637294
2012-04-29 07:28:33.637356 : kjbcrcomplete[0x15c91.1 76896.0][0]
2012-04-29 07:28:33.637374 : kjbrcvdscn[0x0.12bbc4][from 2][idx
2012-04-29 07:28:33.637389 : kjbrcvdscn[no bscn <= rscn 0x0.12bbc4][from 2]
*** 2012-04-29 07:28:33.637
kclwcrs: req=0 typ=cr(2) wtyp=2hop tm=865

As the trace shows, the "SELECT * FROM TEST" itself also incurred a 'gc cr block 2-way' wait: kjbsentscn[0x0.12bbc4][to 2] sends a CR request at SCN 0x12bbc4 = 1227716, and kjbrcvdscn[0x0.12bbc4][from 2] shows the foreground process receiving from the remote LMS a CR copy built at SCN version 1227716, again matching what X$BH displayed. So for each query Instance 1 requested and received a brand-new CR copy at a new SCN version, rather than reusing the CR copies already sitting in its buffer cache. To see what governs this, change the following hidden parameters:

SQL> alter system set "_enable_minscn_cr"=false scope=spfile;
System altered.

SQL> alter system set "_db_block_max_cr_dba"=20 scope=spfile;
System altered.

SQL> startup force;
ORA-32004: obsolete or deprecated parameter(s) specified for RDBMS instance
ORACLE instance started.

Total System Global Area 1570009088 bytes
Fixed Size                  2228704 bytes
Variable Size             989859360 bytes
Database Buffers          570425344 bytes
Redo Buffers                7495680 bytes
Database mounted.
Database opened.

With "_enable_minscn_cr"=false and "_db_block_max_cr_dba"=20 in effect on the RAC, repeat the experiment: Instance 2, Session C runs the UPDATE first, then Instance 1 queries the table so that it again has to request CR copies:

SQL> update test set id=id+1 where id=2;   -- Instance 2
1 row updated.

SQL> select * From test;                   -- Instance 1

ID
----------
1
2

Check X$BH on Instance 1:

select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

 STATE CR_SCN_BAS
---------- ----------
3 1273080
3 1273071
3 1273041
3 1273039
8 0

SQL> update test set id=id+1 where id=3;
1 row updated.

SQL> select * From test;

ID
----------
1
2

SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

STATE CR_SCN_BAS
---------- ----------
3 1273091
3 1273080
3 1273071
3 1273041
3 1273039
8 0

...................

Each further update/select round adds more CR copies. Eventually:

SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

STATE CR_SCN_BAS
---------- ----------
3 1273793
3 1273782
3 1273780
3 1273769
3 1273734
3 1273715
3 1273691
3 1273679
3 1273670
3 1273643
3 1273635
3 1273623
3 1273106
3 1273091
3 1273080
3 1273071
3 1273041
3 1273039
3 1273033

19 rows selected.

SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;

STATE CR_SCN_BAS
---------- ----------
3 1274993

To summarize: with the two hidden parameters changed, "_enable_minscn_cr" (enable/disable the MinSCN optimization for CR) set to FALSE and "_db_block_max_cr_dba" (Maximum Allowed Number of CR buffers per dba) raised to 20, Instance 1 was able to accumulate as many as 19 CR copies of this single block in its buffer cache.

"_enable_minscn_cr" controls a MinSCN optimization introduced in 11g: Oracle works out the minimum SCN that a consistent read actually requires, so that a foreground process receiving a CR block at a sufficiently recent SCN version can use it directly instead of keeping a separate CR copy for every SCN version. This prevents the buffer cache from filling up with near-duplicate CR versions of the same hot block; the snapshot SCN of a query, rather than one cached copy per SCN version, decides which CR copy can satisfy it.

"_db_block_max_cr_dba" caps how many CR copies of a single data block address (DBA) may exist in the buffer cache. We raised it to 20 here, which is why up to 19 CR copies could accumulate; at its default value of 6, the buffer cache would hold at most 6 CR copies of one block.

With "_enable_minscn_cr" disabled, the MinSCN bound on CR handling is gone, so the existing CR copies cannot be reused: every CR request sent by a foreground process causes the holder instance's LMS to build a "best" CR version for that request, and the resulting copies pile up in the cache.
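As an aside for anyone reproducing the test, the following helper query is a small sketch of my own (it assumes SYS access, since X$BH is a fixed table, and reuses the file#/dbablk values from this experiment). It decodes the X$BH STATE values discussed above, including STATE=8, the Past Image state that appeared once the current block migrated to the other instance:

-- Run as SYS on each instance; substitute your own file# / dbablk values.
SELECT decode(state, 0, 'FREE',
                     1, 'XCUR',
                     2, 'SCUR',
                     3, 'CR',
                     4, 'READ',
                     5, 'MREC',
                     6, 'IREC',
                     8, 'PI',
                     to_char(state)) AS buffer_state,
       cr_scn_bas
  FROM x$bh
 WHERE file#  = 1
   AND dbablk = 89233
   AND state != 0;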

    Read the article

  • How to Load Oracle Tables From Hadoop Tutorial (Part 5 - Leveraging Parallelism in OSCH)

    - by Bob Hanckel
Using OSCH: Beyond Hello World

In the previous post we discussed a "Hello World" example for OSCH, focusing on the mechanics of getting a toy end-to-end example working. In this post we are going to talk about how to make it work for big data loads. We will explain how to optimize an OSCH external table for load, paying particular attention to Oracle's DOP (degree of parallelism), the number of external table location files we use, and the number of HDFS files that make up the payload. We will provide some rules that serve as best practices when using OSCH. The assumption is that you have read the previous post and have some end-to-end OSCH external tables working, and now you want to ramp up the size of the loads.

Using OSCH External Tables for Access and Loading

OSCH external tables are no different from any other Oracle external tables. They can be used to access HDFS content using Oracle SQL:

SELECT * FROM my_hdfs_external_table;

or use the same SQL access to load a table in Oracle:

INSERT INTO my_oracle_table SELECT * FROM my_hdfs_external_table;

To speed up the load time, you will want to control the degree of parallelism (i.e. DOP) and add two SQL hints:

ALTER SESSION FORCE PARALLEL DML PARALLEL 8;
ALTER SESSION FORCE PARALLEL QUERY PARALLEL 8;
INSERT /*+ append pq_distribute(my_oracle_table, none) */ INTO my_oracle_table
SELECT * FROM my_hdfs_external_table;

There are various ways of hinting at what level of DOP you want to use. The ALTER SESSION statements above force the issue, assuming you (the user of the session) are allowed to assert the DOP (more on that in the next section). Alternatively, you could embed additional parallel hints directly into the INSERT and SELECT clauses respectively:

/*+ parallel(my_oracle_table,8) */
/*+ parallel(my_hdfs_external_table,8) */

Note that the "append" hint lets you load a target table by reserving space above a given "high watermark" in storage and uses direct path load. In other words, it doesn't try to fill blocks that are already allocated and partially filled; it uses unallocated blocks. It is an optimized way of loading a table without incurring the typical resource overhead associated with run-of-the-mill inserts. The "pq_distribute" hint in this context unifies the INSERT and SELECT operators to make data flow during a load more efficient. Finally, your target Oracle table should be defined with "NOLOGGING" and "PARALLEL" attributes. The combination of "NOLOGGING" and the "append" hint disables REDO logging and its overhead. The "PARALLEL" clause tells Oracle to try to use parallel execution when operating on the target table.

Determine Your DOP

It might feel natural to build your datasets in Hadoop, then afterwards figure out how to tune the OSCH external table definition, but you should start backwards. You should focus on the Oracle database, specifically the DOP you want to use when loading (or accessing) HDFS content using external tables. The DOP in Oracle controls how many PQ slaves are launched in parallel when executing an external table. Typically the DOP is something you want Oracle to control transparently, but for loading content from Hadoop with OSCH, it's something that you will want to control. Oracle computes the maximum DOP that can be used by an Oracle user. The maximum value that can be assigned is an integer value typically equal to the number of CPUs on your Oracle instances, times the number of cores per CPU, times the number of Oracle instances. For example, suppose you have a RAC environment with 2 Oracle instances, and each system has 2 CPUs with 32 cores. The maximum DOP would be 128 (i.e. 2*2*32). In point of fact, if you are running on a production system, the maximum DOP you are allowed to use will be restricted by the Oracle DBA. This is because using the system maximum DOP can subsume all system resources on Oracle and starve anything else that is executing. Obviously, on a production system where resources need to be shared 24x7, this can't be allowed to happen. The use cases for being able to run OSCH with a maximum DOP are when you have exclusive access to all the resources on an Oracle system. This can be in situations when you are first seeding tables in a new Oracle database, or when normal activity in the production database can be safely taken off-line for a few hours to free up resources for a big incremental load. Using OSCH on high-end machines (specifically Oracle Exadata and Oracle BDA cabled with InfiniBand), this mode of operation can load up to 15TB per hour. The bottom line is that you should first figure out what DOP you will be allowed to run with by talking to the DBAs who manage the production system. You then use that number to derive the number of location files, and (optionally) the number of HDFS data files that you want to generate, assuming that is flexible.

Rule 1: Find out the maximum DOP you will be allowed to use with OSCH on the target Oracle system.

Determining the Number of Location Files

Let's assume that the DBA told you that your maximum DOP was 8. You want the number of location files in your external table to be big enough to utilize all 8 PQ slaves, and you want them to represent equally balanced workloads. Remember, location files in OSCH are metadata lists of HDFS files and are created using OSCH's External Table tool. They also represent the workload size given to an individual Oracle PQ slave (i.e. a PQ slave is given one location file to process at a time, and only it will process the contents of the location file).

Rule 2: The size of the workload of a single location file (and the PQ slave that processes it) is the sum of the content size of the HDFS files it lists.

For example, if a location file lists 5 HDFS files which are each 100GB in size, the workload size for that location file is 500GB. The number of location files that you generate is something you control by providing a number as input to OSCH's External Table tool.

Rule 3: The number of location files chosen should be a small multiple of the DOP.

Each location file represents one workload for one PQ slave. So the goal is to keep all slaves busy and try to give them equivalent workloads. Obviously, if you run with a DOP of 8 but have 5 location files, only five PQ slaves will have something to do and the other three will have nothing to do and will quietly exit. If you run with 9 location files, then the PQ slaves will pick up the first 8 location files and, assuming they have equal workloads, will finish up about the same time. But the first PQ slave to finish its job will then be rescheduled to process the ninth location file, potentially doubling the end-to-end processing time. So for this DOP, using 8, 16, or 32 location files would be a good idea.
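Pulling Rules 1 through 3 together, a load session for the DOP-of-8 example might look like the following sketch. The table name and columns are placeholders of my own; the session settings and hints are the ones discussed above:

-- Target table defined with NOLOGGING and PARALLEL, as recommended above.
CREATE TABLE my_oracle_table
(
  id  NUMBER,
  val VARCHAR2(100)
)
NOLOGGING
PARALLEL 8;

-- Assert the DOP agreed upon with the DBA (8 in the running example).
ALTER SESSION ENABLE PARALLEL DML;
ALTER SESSION FORCE PARALLEL DML PARALLEL 8;
ALTER SESSION FORCE PARALLEL QUERY PARALLEL 8;

-- Direct path load with no redo; pq_distribute unifies the INSERT and SELECT operators.
INSERT /*+ append pq_distribute(my_oracle_table, none) */ INTO my_oracle_table
SELECT * FROM my_hdfs_external_table;

COMMIT;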
Determining the Number of HDFS Files

Let's start with the next rule and then explain it:

Rule 4: The number of HDFS files should try to be a multiple of the number of location files, and the files should try to be relatively the same size.

In our running example, the DOP is 8. This means that the number of location files should be a small multiple of 8. Remember that each location file represents a list of unique HDFS files to load, and that the sum of the files listed in each location file is a workload for one Oracle PQ slave. The OSCH External Table tool will look in an HDFS directory for a set of HDFS files to load. It will generate N location files (where N is the value you gave to the tool). It will then try to divvy up the HDFS files and do its best to make sure the workload across location files is as balanced as possible. (The tool uses a greedy algorithm that grabs the biggest HDFS file and delegates it to a particular location file. It then looks for the next biggest file and puts it in some other location file, and so on.) The tool's ability to balance is reduced if HDFS file sizes are grossly out of balance or are too few. For example, suppose my DOP is 8 and the number of location files is 8, but I have only 8 HDFS files, where one file is 900GB and the others are 100GB. When the tool tries to balance the load it will be forced to put the singleton 900GB file into one location file and put each of the 100GB files in the 7 remaining location files. The load balance skew is 9 to 1: one PQ slave will be working overtime while the slacker PQ slaves are off enjoying happy hour. If, however, the total payload (1600GB) were broken up into smaller HDFS files, the OSCH External Table tool would have an easier time generating a list where the workload for each location file is relatively the same. Applying Rule 4 above to our DOP of 8, we could divide the workload into 160 files that were approximately 10GB in size. For this scenario the OSCH External Table tool would populate each location file with 20 HDFS file references, and all location files would have similar workloads (approximately 200GB per location file). As a rule, when the OSCH External Table tool has to deal with more and smaller files, it will be able to create more balanced loads. How small should HDFS files get? Not so small that the HDFS open and close file overhead starts having a substantial impact. For our performance test system (Exadata/BDA with InfiniBand), I compared three OSCH loads of 1 TiB. One load had 128 HDFS files living in 64 location files, where each HDFS file was about 8GB. I then did the same load with 12800 files, where each HDFS file was about 80MB in size. The end-to-end load time was virtually the same. However, when I got ridiculously small (i.e. 128000 files at about 8MB per file), it started to make an impact and slow down the load time. What happens if you break rules 3 or 4 above? Nothing draconian; everything will still function. You just won't be taking full advantage of the generous DOP that was allocated to you by your friendly DBA. The key point of the rules articulated above is this: if you know that HDFS content is ultimately going to be loaded into Oracle using OSCH, it makes sense to chop it up into the right number of files, roughly the same size, derived from the DOP that you expect to use for loading.

Next Steps

So far we have talked about OLH and OSCH as alternative models for loading. That's not quite the whole story.
They can be used together in a way that provides for more efficient OSCH loads and allows one to be more flexible about scheduling on a Hadoop cluster and an Oracle Database to perform load operations. The next lesson will talk about Oracle Data Pump files generated by OLH and loaded using OSCH. It will also outline the pros and cons of using various load methods. This will be followed up with a final tutorial lesson focusing on how to optimize OLH and OSCH for use on Oracle's engineered systems: specifically Exadata and the BDA.

    Read the article

  • Parsing Concerns

    - by Jesse
If you've ever written an application that accepts date and/or time inputs from an external source (a person, an uploaded file, posted XML, etc.) then you've no doubt had to deal with parsing some text representing a date into a data structure that a computer can understand. Similarly, you've probably also had to take values from those same data structures and turn them back into their original formats. Most (all?) suitably modern development platforms expose some kind of parsing and formatting functionality for turning text into dates and vice versa. In .NET, the DateTime data structure exposes 'Parse' and 'ToString' methods for this purpose. This post will focus mostly on parsing, though most of the examples and suggestions below can also be applied to the ToString method. The DateTime.Parse method is pretty permissive in the values that it will accept (though apparently not as permissive as some other languages), which makes it pretty easy to take some text provided by a user and turn it into a proper DateTime instance. Here are some examples (note that the resulting DateTime values are shown using the RFC1123 format):

DateTime.Parse("3/12/2010");              //Fri, 12 Mar 2010 00:00:00 GMT
DateTime.Parse("2:00 AM");                //Sat, 01 Jan 2011 02:00:00 GMT (took today's date as date portion)
DateTime.Parse("5-15/2010");              //Sat, 15 May 2010 00:00:00 GMT
DateTime.Parse("7/8");                    //Fri, 08 Jul 2011 00:00:00 GMT
DateTime.Parse("Thursday, July 1, 2010"); //Thu, 01 Jul 2010 00:00:00 GMT

Dealing With Inaccuracy

While the DateTime struct has the ability to store a date and time value accurate down to the millisecond, most date strings provided by a user are not going to specify values with that much precision. In each of the above examples, the Parse method was provided a partial value from which to construct a proper DateTime. This means it had to go ahead and assume what you meant and fill in the missing parts of the date and time for you. This is a good thing, especially when we're talking about taking input from a user. We can't expect every person using our software to provide a year, day, month, hour, minute, second, and millisecond every time they need to express a date. That said, it's important for developers to understand what assumptions the software might be making and plan accordingly. I think the assumptions that were made in each of the above examples were pretty reasonable, though if we dig into this method a little bit deeper we'll find that there are a lot more assumptions being made under the covers than you might have previously known. One of the biggest assumptions that the DateTime.Parse method has to make relates to the format of the date represented by the provided string. Let's consider this example input string: '10-02-15'. To some people, that might look like '15-Feb-2010'. To others, it might be '02-Oct-2015'. Like many things, it depends on where you're from.

This Is America!

Most cultures around the world have adopted either a "little-endian" or a "big-endian" format. (Source: Date And Time Notation By Country) In this context, a "little-endian" date format would list the date parts with the least significant first, while a "big-endian" date format would list them with the most significant first. For example, a "little-endian" date would be "day-month-year" and a "big-endian" date would be "year-month-day". It's worth noting here that ISO 8601 defines a "big-endian" format as the international standard.
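To make the ambiguity concrete, here is a quick sketch (my own illustration, not code from the original post) showing the same string parsed under two different cultures; the culture names are standard .NET culture identifiers:

using System;
using System.Globalization;

class DateAmbiguityDemo
{
    static void Main()
    {
        string input = "10/02/15";

        // en-US reads month/day/year; fr-FR reads day/month/year.
        DateTime us = DateTime.Parse(input, new CultureInfo("en-US"));
        DateTime fr = DateTime.Parse(input, new CultureInfo("fr-FR"));

        Console.WriteLine(us.ToString("yyyy-MM-dd")); // 2015-10-02
        Console.WriteLine(fr.ToString("yyyy-MM-dd")); // 2015-02-10
    }
}

Same bytes in, two different dates out; that is the whole problem in five lines.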
While I personally prefer "big-endian" style date formats, I think both styles make sense in that they follow some logical standard with respect to ordering the date parts by their significance. Here in the United States, however, we buck that trend by using what is, in comparison, a completely nonsensical format of "month/day/year". Almost no other country in the world uses this format. I've been fortunate in my life to have done some international travel, so I've been aware of this difference for many years, but never really thought much about it. Until recently, I had been developing software for exclusively US-based audiences and remained blissfully ignorant of the different date formats employed by other countries around the world. The web application I work on is being rolled out to users in different countries, so I was recently tasked with updating it to support different date formats. As it turns out, .NET has a great mechanism for dealing with different date formats right out of the box. Supporting date formats for different cultures is actually pretty easy once you understand this mechanism.

Pulling the Curtain Back On the Parse Method

Have you ever taken a look at the different flavors (read: overloads) that the DateTime.Parse method comes in? In its simplest form, it takes a single string parameter and returns the corresponding DateTime value (if it can divine what the date value should be). You can optionally provide two additional parameters to this method: a 'System.IFormatProvider' and a 'System.Globalization.DateTimeStyles'. Both of these optional parameters have some bearing on the assumptions that get made while parsing a date, but for the purposes of this article I'm going to focus on the 'System.IFormatProvider' parameter. The IFormatProvider exposes a single method called 'GetFormat' that returns an object to be used for determining the proper format for displaying and parsing things like numbers and dates. This interface plays a big role in the globalization capabilities that are built into the .NET Framework. The cornerstone of these globalization capabilities can be found in the 'System.Globalization.CultureInfo' class. To put it simply, the CultureInfo class is used to encapsulate information related to things like language, writing system, and date formats for a certain culture. Support for many cultures is "baked in" to the .NET Framework and there is capacity for defining custom cultures if needed (though I've never delved into that). The details of the CultureInfo class are beyond the scope of this post, so for now let me just point out that the CultureInfo class implements the IFormatProvider interface. This means that a CultureInfo instance created for a given culture can be provided to the DateTime.Parse method in order to tell it what date formats it should expect. So what happens when you don't provide this value? Let's crack this method open in Reflector: when no IFormatProvider parameter is provided (i.e. we use the simple DateTime.Parse(string) overload), 'DateTimeFormatInfo.CurrentInfo' is used instead. Drilling down a bit further, we can see the implementation of the DateTimeFormatInfo.CurrentInfo property. From this property we can determine that, in the absence of an IFormatProvider being specified, the DateTime.Parse method will assume that the provided date should be treated as if it were in the format defined by the CultureInfo object that is attached to the current thread.
The culture specified by the CultureInfo instance on the current thread can vary depending on several factors, but if you're writing an application where a single instance might be used by people from different cultures (i.e. a web application with an international user base), it's important to know what this value is. Having a solid strategy for setting the current thread's culture for each incoming request in an internationally used ASP.NET application is obviously important, and might make a good topic for a future post. For now, let's think about the implications of not having the correct culture set on the current thread. Let's say you're running an ASP.NET application on a server in the United States. The server was set up by English speakers in the United States, so it's configured for US English. It exposes a web page where users can enter order data, one piece of which is an anticipated order delivery date. Most users are in the US, and therefore enter dates in a 'month/day/year' format. The application is using the DateTime.Parse(string) method to turn the values provided by the user into actual DateTime instances that can be stored in the database. This all works fine, because your users and your server both think of dates in the same way. Now you need to support some users in South America, where a 'day/month/year' format is used. The best case scenario at this point is that a user will enter March 13, 2011 as '13/03/2011'. This would cause the call to DateTime.Parse to blow up, since that value doesn't look like a valid date in the US English culture (note: in all likelihood you might be using the DateTime.TryParse(string) method here instead, but that method behaves the same way with regard to date formats). "But wait a minute", you might be saying to yourself, "I thought you said that this was the best case scenario?" This scenario would prevent users from entering orders in the system, which is bad, but it could be worse! What if the order needs to be delivered a day earlier than that, on March 12, 2011? Now the user enters '12/03/2011'. Now the call to DateTime.Parse sees what it thinks is a valid date, but there's just one problem: it's not the right date. Now this order won't get delivered until December 3, 2011. In my opinion, that kind of data corruption is a much bigger problem than having the Parse call fail.

What To Do?

My order entry example is a bit contrived, but I think it serves to illustrate the potential issues with accepting date input from users. There are some approaches you can take to make this easier on you and your users:

• Eliminate ambiguity by using a graphical date input control. I'm personally a fan of the jQuery UI Datepicker widget. It's pretty easy to set up, can be themed to match the look and feel of your site, and has support for multiple languages and cultures.
• Be sure you have a way to track the culture preference of each user in your system. For a web application this could be done using something like a cookie or session state variable.
• Ensure that the current user's culture is being applied correctly to DateTime formatting and parsing code. This can be accomplished by ensuring that each request has the handling thread's CultureInfo set properly, or by using the Format and Parse method overloads that accept an IFormatProvider instance, where the provided value is a CultureInfo object constructed using the current user's culture preference.
• When in doubt, favor formats that are internationally recognizable.
Using the string '2010-03-05' is likely to be recognized as March 5, 2010 by users from most (if not all) cultures.
• Favor standard date format strings over custom ones.

So far we've only talked about turning a string into a DateTime, but most of the same "gotchas" apply when doing the opposite. Consider this code:

someDateValue.ToString("MM/dd/yyyy");

This will output the same string regardless of what the current thread's culture is set to (with the exception of some cultures that don't use the Gregorian calendar system, but that's another issue altogether). For displaying dates to users, it would be better to do this:

someDateValue.ToString("d");

This standard format string of "d" will use the "short date format" as defined by the culture attached to the current thread (or provided in the IFormatProvider instance in the proper method overload). This means that it will honor the proper month/day/year, year/month/day, or day/month/year format for the culture.

Knowing Your Audience

The examples and suggestions shown above can go a long way toward getting an application in shape for dealing with date inputs from users in multiple cultures. There are some instances, however, where taking approaches like these would not be appropriate. In some cases, the provider or consumer of date values that pass through your application is not a person, but another application (or another portion of your own application). For example, if your site has a page that accepts a date as a query string parameter, you'll probably want to format that date using an invariant date format. Otherwise, the same URL could end up evaluating to a different page depending on the user that is viewing it. In addition, if your application exports data for consumption by other systems, it's best to have an agreed-upon format that all systems can use and that will not vary depending upon whether the users of the systems on either side prefer a month/day/year or day/month/year format. I'll look more at some approaches for dealing with these situations in a future post. If you take away one thing from this post, make it an understanding of the importance of knowing where the dates that pass through your system come from and are going to. You will likely want to vary your parsing and formatting approach depending on your audience.
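Pulling the suggestions above together, here is a small sketch of my own (the helper name and the ISO 8601 fallback format are assumptions, not code from the post) of a culture-aware parsing helper:

using System;
using System.Globalization;

public static class DateInputParser
{
    // Tries to parse user-entered text using the user's own culture preference,
    // falling back to ISO 8601, a culture-invariant "big-endian" format.
    public static bool TryParseUserDate(string input, CultureInfo userCulture, out DateTime result)
    {
        if (DateTime.TryParse(input, userCulture, DateTimeStyles.None, out result))
            return true;

        // ISO 8601 fallback: unambiguous regardless of the user's culture.
        return DateTime.TryParseExact(input, "yyyy-MM-dd", CultureInfo.InvariantCulture,
                                      DateTimeStyles.None, out result);
    }
}

A caller that knows the user prefers, say, Argentinian Spanish would invoke TryParseUserDate(input, new CultureInfo("es-AR"), out var date), and '12/03/2011' would come back as March 12, 2011 rather than December 3, 2011.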

    Read the article

  • Oracle Database 12c New Feature: Information Lifecycle Management (ILM) Storage Enhancements

    - by Liu Maclean
Oracle Database 12c introduces a set of Storage Enhancements for Information Lifecycle Management (ILM), the centerpiece of which is Automatic Data Optimization (ADO, also referred to as Automatic Data Placement, ADP). 12c also finally delivers Online Move Datafile: a datafile can now be relocated while it stays online and in use, with no need to take it offline first.

In the current release (12.1.0.1), Automatic Data Optimization and Heat Map come with the following restrictions:

• Automatic Data Optimization and Heat Map are not supported in a multitenant container database (CDB).
• Row-level policies for ADO are not supported for Temporal Validity. Partition-level ADO and compression are supported if partitioned on the end-time columns.
• Row-level policies for ADO are not supported for in-database archiving. Partition-level ADO and compression are supported if partitioned on the ORA_ARCHIVE_STATE column.
• Custom policies (user-defined functions) for ADO are not supported if the policies default at the tablespace level.
• ADO does not perform checks for storage space in a target tablespace when using storage tiering.
• ADO is not supported on tables with object types or materialized views.
• ADO concurrency (the number of simultaneous policy jobs for ADO) depends on the concurrency of the Oracle scheduler.
• If a policy job for ADO fails more than two times, then the job is marked disabled and must be manually enabled later.
• Policies for ADO are only run in the Oracle Scheduler maintenance windows. Outside of the maintenance windows all policies are stopped. The only exceptions are those jobs for rebuilding indexes in ADO offline mode.
• ADO has restrictions related to moving tables and table partitions.

ADO policies can be defined at row or segment granularity and are attached to a table (or partition) with CREATE TABLE or ALTER TABLE. As data cools, ADO can automatically compress it or move it to a different storage tier, so much of a table's lifecycle management no longer needs hand-written maintenance jobs. For example, the following creates a table with a segment-level compression policy, then adds a similar policy to one partition of an existing table:

CREATE TABLE sales_ado
(PROD_ID NUMBER NOT NULL,
 CUST_ID NUMBER NOT NULL,
 TIME_ID DATE NOT NULL,
 CHANNEL_ID NUMBER NOT NULL,
 PROMO_ID NUMBER NOT NULL,
 QUANTITY_SOLD NUMBER(10,2) NOT NULL,
 AMOUNT_SOLD NUMBER(10,2) NOT NULL
)
ILM ADD POLICY COMPRESS FOR ARCHIVE HIGH SEGMENT
AFTER 6 MONTHS OF NO ACCESS;

SQL> SELECT SUBSTR(policy_name,1,24) AS POLICY_NAME, policy_type, enabled
  2  FROM USER_ILMPOLICIES;

POLICY_NAME          POLICY_TYPE                ENABLED
-------------------- -------------------------- --------------
P41                  DATA MOVEMENT              YES

ALTER TABLE sales MODIFY PARTITION sales_1995
ILM ADD POLICY COMPRESS FOR ARCHIVE HIGH SEGMENT
AFTER 6 MONTHS OF NO ACCESS;

SELECT SUBSTR(policy_name,1,24) AS POLICY_NAME, policy_type, enabled
FROM USER_ILMPOLICIES;

POLICY_NAME              POLICY_TYPE   ENABLE
------------------------ ------------- ------
P1                       DATA MOVEMENT YES
P2                       DATA MOVEMENT YES

/* You can disable an ADO policy with the following */
ALTER TABLE sales_ado ILM DISABLE POLICY P1;
/* You can delete an ADO policy with the following */
ALTER TABLE sales_ado ILM DELETE POLICY P1;
/* You can disable all ADO policies with the following */
ALTER TABLE sales_ado ILM DISABLE_ALL;
/* You can delete all ADO policies with the following */
ALTER TABLE sales_ado ILM DELETE_ALL;
/* You can disable an ADO policy in a partition with the following */
ALTER TABLE sales MODIFY PARTITION sales_1995 ILM DISABLE POLICY P2;
/* You can delete an ADO policy in a partition with the following */
ALTER TABLE sales MODIFY PARTITION sales_1995 ILM DELETE POLICY P2;

For ILM/ADO to act on data, Oracle first needs to know how the data is actually being used; that is what activity tracking provides. Activity tracking comes in two granularities, which can be combined:

• SEGMENT-LEVEL tracking records access and modification activity for an entire segment.
• ROW-LEVEL tracking records modification activity at a finer grain, so policies can act on row-level aging.

Some examples:

1. Enable segment-level activity tracking:

ALTER TABLE interval_sales ILM ENABLE ACTIVITY TRACKING SEGMENT ACCESS;

This enables segment-level activity tracking on INTERVAL_SALES, recording accesses against its segments as they happen.

2. Track creation and write times:

ALTER TABLE emp ILM ENABLE ACTIVITY TRACKING (CREATE TIME , WRITE TIME);

3. Track read times:

ALTER TABLE emp ILM ENABLE ACTIVITY TRACKING (READ TIME);

In 12.1.0.1.0 the tracking data is collected by the Heat Map feature, controlled by the new HEAT_MAP initialization parameter, which can be set at the system or session level. To enable Heat Map:

ALTER SYSTEM SET HEAT_MAP = ON;

With HEAT_MAP enabled, Oracle tracks reads and writes against segments in the database; objects in the SYSTEM and SYSAUX tablespaces are excluded from tracking. To disable it again:

ALTER SYSTEM SET HEAT_MAP = OFF;

Note that Automatic Data Optimization (ADO) depends on the statistics gathered by Heat Map, so HEAT_MAP must be ON for ADO policies to be evaluated.

Real-time Heat Map data is exposed through V$HEAT_MAP_SEGMENT:

SQL> select * from V$heat_map_segment;
no rows selected

SQL> alter session set heat_map=on;
Session altered.

SQL> select * from scott.emp;

EMPNO ENAME  JOB       MGR   HIREDATE    SAL  COMM  DEPTNO
----- ------ --------- ----- --------- ----- ----- -------
 7369 SMITH  CLERK      7902 17-DEC-80   800          20
 7499 ALLEN  SALESMAN   7698 20-FEB-81  1600   300    30
 7521 WARD   SALESMAN   7698 22-FEB-81  1250   500    30
 7566 JONES  MANAGER    7839 02-APR-81  2975          20
 7654 MARTIN SALESMAN   7698 28-SEP-81  1250  1400    30
 7698 BLAKE  MANAGER    7839 01-MAY-81  2850          30
 7782 CLARK  MANAGER    7839 09-JUN-81  2450          10
 7788 SCOTT  ANALYST    7566 19-APR-87  3000          20
 7839 KING   PRESIDENT       17-NOV-81  5000          10
 7844 TURNER SALESMAN   7698 08-SEP-81  1500     0    30
 7876 ADAMS  CLERK      7788 23-MAY-87  1100          20
 7900 JAMES  CLERK      7698 03-DEC-81   950          30
 7902 FORD   ANALYST    7566 03-DEC-81  3000          20
 7934 MILLER CLERK      7782 23-JAN-82  1300          10

14 rows selected.

SQL> select * from v$heat_map_segment;

OBJECT_NAME          SUBOBJECT_NAME             OBJ#   DATAOBJ# TRACK_TIM SEG SEG FUL LOO     CON_ID
-------------------- -------------------- ---------- ---------- --------- --- --- --- --- ----------
EMP                                            92997      92997 23-JUL-13 NO  NO  YES NO           0

V$HEAT_MAP_SEGMENT is built on the fixed table X$HEATMAPSEGMENT and displays real-time segment access information. Its columns are:

OBJECT_NAME     VARCHAR2(128)  Name of the object
SUBOBJECT_NAME  VARCHAR2(128)  Name of the subobject
OBJ#            NUMBER         Object number
DATAOBJ#        NUMBER         Data object number
TRACK_TIME      DATE           Timestamp of current activity tracking
SEGMENT_WRITE   VARCHAR2(3)    Indicates whether the segment has write access (YES or NO)
SEGMENT_READ    VARCHAR2(3)    Indicates whether the segment has read access (YES or NO)
FULL_SCAN       VARCHAR2(3)    Indicates whether the segment has full table scan (YES or NO)
LOOKUP_SCAN     VARCHAR2(3)    Indicates whether the segment has lookup scan (YES or NO)
CON_ID          NUMBER         The ID of the container to which the data pertains. Possible values: 0 (rows containing data that pertain to the entire CDB, or rows in non-CDBs), 1 (rows containing data that pertain only to the root), or n, the applicable container ID. The Heat Map feature is not supported in CDBs in Oracle Database 12c, so the value in this column can be ignored.

Heat Map data that has been persisted to disk can be viewed through DBA_HEAT_MAP_SEGMENT, whose underlying base table is HEAT_MAP_STAT$.

A worked Automatic Data Optimization example follows. Step 1: enable Heat Map and prepare the SCOTT schema (script available at http://www.askmaclean.com/archives/scott-schema-script.html):

SQL> alter system set heat_map=on;
System altered.

SQL> grant all on dbms_lock to scott;
Grant succeeded.
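To round out the segment-level example above, the following is a sketch of the other two common ADO policy forms in 12c, a row-level compression policy and a storage-tiering policy, based on the documented ILM clause syntax; the tablespace name low_cost_ts is a placeholder of my own:

-- Row-level ADO policy: compress blocks whose rows have seen no
-- modification for 30 days (requires row-level activity tracking).
ALTER TABLE sales_ado ILM ADD POLICY
  ROW STORE COMPRESS ADVANCED ROW
  AFTER 30 DAYS OF NO MODIFICATION;

-- Storage-tiering ADO policy: move the segment to a lower-cost
-- tablespace; note from the restrictions above that ADO does not
-- check free space in the target tablespace for you.
ALTER TABLE sales_ado ILM ADD POLICY
  TIER TO low_cost_ts;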
    Activity tracking can be declared at two levels, and the gathered statistics feed the policy conditions:
    SEGMENT-LEVEL tracking records the most recent access and modification times for each segment.
    ROW-LEVEL tracking records modification times at row granularity, aggregated per block.
    Some examples of enabling tracking:
    1. Enable SEGMENT-LEVEL activity tracking:
    ALTER TABLE interval_sales ILM ENABLE ACTIVITY TRACKING SEGMENT ACCESS;
    This enables segment-level activity tracking on the INTERVAL_SALES table, so the time of each segment access is recorded.
    2. Track creation and modification times:
    ALTER TABLE emp ILM ENABLE ACTIVITY TRACKING (CREATE TIME, WRITE TIME);
    3. Track read times:
    ALTER TABLE emp ILM ENABLE ACTIVITY TRACKING (READ TIME);
    12.1.0.1 also introduces the HEAT_MAP initialization parameter, which can be set at the system or session level to switch heat map tracking on and off. To enable heat map tracking:
    ALTER SYSTEM SET HEAT_MAP = ON;
    With heat map enabled, accesses to all segments are tracked; objects in the SYSTEM and SYSAUX tablespaces are not tracked. To disable tracking again:
    ALTER SYSTEM SET HEAT_MAP = OFF;
    The HEAT_MAP parameter defaults to OFF. It also governs Automatic Data Optimization (ADO): to use ADO, heat map tracking must be enabled. The real-time tracking data can be queried through V$HEAT_MAP_SEGMENT:
    SQL> select * from v$heat_map_segment;
    no rows selected
    SQL> alter session set heat_map=on;
    Session altered.
    SQL> select * from scott.emp;
    EMPNO ENAME      JOB       MGR  HIREDATE   SAL  COMM DEPTNO
    ----- ---------- --------- ---- --------- ---- ----- ------
    7369  SMITH      CLERK     7902 17-DEC-80  800          20
    7499  ALLEN      SALESMAN  7698 20-FEB-81 1600   300    30
    7521  WARD       SALESMAN  7698 22-FEB-81 1250   500    30
    7566  JONES      MANAGER   7839 02-APR-81 2975          20
    7654  MARTIN     SALESMAN  7698 28-SEP-81 1250  1400    30
    7698  BLAKE      MANAGER   7839 01-MAY-81 2850          30
    7782  CLARK      MANAGER   7839 09-JUN-81 2450          10
    7788  SCOTT      ANALYST   7566 19-APR-87 3000          20
    7839  KING       PRESIDENT      17-NOV-81 5000          10
    7844  TURNER     SALESMAN  7698 08-SEP-81 1500     0    30
    7876  ADAMS      CLERK     7788 23-MAY-87 1100          20
    7900  JAMES      CLERK     7698 03-DEC-81  950          30
    7902  FORD       ANALYST   7566 03-DEC-81 3000          20
    7934  MILLER     CLERK     7782 23-JAN-82 1300          10
    14 rows selected.
    SQL> select * from v$heat_map_segment;
    OBJECT_NAME  SUBOBJECT_NAME  OBJ#  DATAOBJ#  TRACK_TIM  SEG SEG FUL LOO CON_ID
    ------------ --------------- ----- --------- ---------- --- --- --- --- ------
    EMP                          92997 92997     23-JUL-13  NO  NO  YES NO  0
    The v$heat_map_segment view is built on the fixed table X$HEATMAPSEGMENT. V$HEAT_MAP_SEGMENT displays real-time segment access information; its columns are:
    OBJECT_NAME    VARCHAR2(128)  Name of the object
    SUBOBJECT_NAME VARCHAR2(128)  Name of the subobject
    OBJ#           NUMBER         Object number
    DATAOBJ#       NUMBER         Data object number
    TRACK_TIME     DATE           Timestamp of current activity tracking
    SEGMENT_WRITE  VARCHAR2(3)    Indicates whether the segment has write access (YES or NO)
    SEGMENT_READ   VARCHAR2(3)    Indicates whether the segment has read access (YES or NO)
    FULL_SCAN      VARCHAR2(3)    Indicates whether the segment has full table scan (YES or NO)
    LOOKUP_SCAN    VARCHAR2(3)    Indicates whether the segment has lookup scan (YES or NO)
    CON_ID         NUMBER         The ID of the container to which the data pertains. Possible values: 0 for rows containing data that pertain to the entire CDB (also used for rows in non-CDBs), 1 for rows that pertain only to the root, and n, the applicable container ID. The Heat Map feature is not supported in CDBs in Oracle Database 12c, so the value in this column can be ignored.
    Heat map data is also persisted to disk and exposed through DBA_HEAT_MAP_SEGMENT; the persisted statistics live in the dictionary table HEAT_MAP_STAT$.
    A worked Automatic Data Optimization example follows. Use case 1:
    SQL> alter system set heat_map=on;
    System altered.
    Build the demo environment first; the SCOTT schema script is at http://www.askmaclean.com/archives/scott-schema-script.html
    SQL> grant all on dbms_lock to scott;
    Grant succeeded.
    SQL> grant dba to scott;
    Grant succeeded.
    @ilm_setup_basic C:\APP\XIANGBLI\ORADATA\MACLEAN\ilm.dbf
    @tktgilm_demo_env_setup
    SQL> connect scott/tiger ;
    Connected.
    SQL> select count(*) from scott.employee;
      COUNT(*)
    ----------
          3072
    1 row selected.
    SQL> set serveroutput on
    SQL> exec print_compression_stats('SCOTT','EMPLOYEE');
    Compression Stats
    ------------------
    Uncmpressed : 3072
    Adv/basic compressed : 0
    Others : 0
    PL/SQL procedure successfully completed.
    All 3072 rows start out uncompressed. Now add a row-level ADO policy: advanced row compression after 3 days without modification:
    alter table employee ilm add policy row store compress advanced row after 3 days of no modification
    /
    SQL> set serveroutput on
    SQL> execute list_ilm_policies;
    --------------------------------------------------
    Policies defined for SCOTT
    --------------------------------------------------
    Object Name------ : EMPLOYEE
    Subobject Name--- :
    Object Type------ : TABLE
    Inherited from--- : POLICY NOT INHERITED
    Policy Name------ : P1
    Action Type------ : COMPRESSION
    Scope------------ : ROW
    Compression level : ADVANCED
    Tier Tablespace-- :
    Condition type--- : LAST MODIFICATION TIME
    Condition days--- : 3
    Enabled---------- : YES
    --------------------------------------------------
    PL/SQL procedure successfully completed.
    SQL> select sysdate from dual;
    SYSDATE
    --------------
    29-JUL-13
    SQL> execute set_back_chktime(get_policy_name('EMPLOYEE',null,'COMPRESSION','ROW','ADVANCED',3,null,null),'EMPLOYEE',null,6);
    Object check time reset ...
    --------------------------------------
    Object Name : EMPLOYEE
    Object Number : 93123
    D.Object Numbr : 93123
    Policy Number : 1
    Object chktime : 23-JUL-13 08.13.42.000000 AM
    Distnt chktime : 0
    --------------------------------------
    PL/SQL procedure successfully completed.
    This winds the policy check time back 6 days: set_back_chktime artificially "ages" the data so the policy condition is met without actually waiting, which is very handy in a test environment.
    Flush the caches:
    alter system flush buffer_cache;
    alter system flush buffer_cache;
    alter system flush shared_pool;
    alter system flush shared_pool;
    SQL> execute set_window('MONDAY_WINDOW','OPEN');
    Set Maint. Window OPEN
    -----------------------------
    Window Name : MONDAY_WINDOW
    Enabled? : TRUE
    Active? : TRUE
    -----------------------------
    PL/SQL procedure successfully completed.
    SQL> exec dbms_lock.sleep(60) ;
    PL/SQL procedure successfully completed.
    SQL> exec print_compression_stats('SCOTT', 'EMPLOYEE');
    Compression Stats
    ------------------
    Uncmpressed : 338
    Adv/basic compressed : 2734
    Others : 0
    PL/SQL procedure successfully completed.
    Once the maintenance window opens, the policy job runs and most of the rows are compressed: Adv/basic compressed : 2734.
    SQL> col object_name for a20
    SQL> select object_id,object_name from dba_objects where object_name='EMPLOYEE';
     OBJECT_ID OBJECT_NAME
    ---------- --------------------
         93123 EMPLOYEE
    SQL> execute list_ilm_policy_executions ;
    --------------------------------------------------
    Policies execution details for SCOTT
    --------------------------------------------------
    Policy Name------ : P22
    Job Name--------- : ILMJOB48
    Start time------- : 29-JUL-13 08.37.45.061000 AM
    End time--------- : 29-JUL-13 08.37.48.629000 AM
    -----------------
    Object Name------ : EMPLOYEE
    Sub_obj Name----- :
    Obj Type--------- : TABLE
    -----------------
    Exec-state------- : SELECTED FOR EXECUTION
    Job state-------- : COMPLETED SUCCESSFULLY
    Exec comments---- :
    Results comments- : ---
    --------------------------------------------------
    PL/SQL procedure successfully completed.
    ILMJOB48 is the job that executed the policy; in 12.1.0.1 it runs as a J00x job slave. The policy evaluation itself is driven by the MMON slaves (M00x), which fire roughly every 15 minutes:
    select sample_time,program,module,action from v$active_session_history where action ='KDILM background EXEcution' order by sample_time;
    29-JUL-13 08.16.38.369000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.17.38.388000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.17.39.390000000 AM
    ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.23.38.681000000 AM ORACLE.EXE (M002) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.32.38.968000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.33.39.993000000 AM ORACLE.EXE (M003) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.33.40.993000000 AM ORACLE.EXE (M003) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.36.40.066000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.37.42.258000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.37.43.258000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.37.44.258000000 AM ORACLE.EXE (M000) MMON_SLAVE KDILM background EXEcution
    29-JUL-13 08.38.42.386000000 AM ORACLE.EXE (M001) MMON_SLAVE KDILM background EXEcution
    select distinct action from v$active_session_history where action like 'KDILM%'
    KDILM background CLeaNup
    KDILM background EXEcution
    SQL> execute set_window('MONDAY_WINDOW','CLOSE');
    Set Maint. Window CLOSE
    -----------------------------
    Window Name : MONDAY_WINDOW
    Enabled? : TRUE
    Active? : FALSE
    -----------------------------
    PL/SQL procedure successfully completed.
    SQL> drop table employee purge ;
    Table dropped.
    Finally, clean up the demo environment:
    spool ilm_usecase_1_cleanup.lst
    @ilm_demo_cleanup ;
    spool off

    Read the article

  • Application Specific Paths for DLL Loading when DLL is loaded dynamically

    - by MartinHT
    Hi: I am building a program that uses a very simple plugin system. This is the code I'm using to load the possible plugins: public interface IPlugin { string Name { get; } string Description { get; } bool Execute(System.Windows.Forms.IWin32Window parent); } private void loadPlugins() { int idx = 0; string[] pluginFolders = getPluginFolders(); Array.ForEach(pluginFolders, folder => { string[] pluginFiles = getPluginFiles(folder); Array.ForEach(pluginFiles, file => { try { System.Reflection.Assembly assembly = System.Reflection.Assembly.LoadFile(file); Array.ForEach(assembly.GetTypes(), type => { if(type.GetInterface("PluginExecutor.IPlugin") != null) { IPlugin plugin = assembly.CreateInstance(type.ToString()) as IPlugin; if(plugin != null) lista.Add(new PluginItem(plugin.Name, plugin.Description, file, plugin)); } }); } catch(Exception) { } }); }); } When the user selects a particular plugin from the list, I launch the plugin's Execute method. So far, so good! As you can see, the plugins are loaded from a folder, and within the folder are several DLLs that are needed by the plugin. My problem is that I can't get the plugin to 'see' the DLLs; it just searches the launching application's startup folder, but not the folder the plugin was loaded from. I have tried several methods: 1. Changing the Current Directory to the plugin's folder. 2. Using an inter-op call to SetDllDirectory 3. Adding an entry in the registry to point to a folder where I want it to look (see code below) None of these methods work. What am I missing? As I load the plugin DLL dynamically, it does not seem to obey any of the above-mentioned methods. What else can I try? Regards, MartinH. //HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths Microsoft.Win32.RegistryKey appPaths = Microsoft.Win32.Registry.LocalMachine.CreateSubKey( string.Format( @"SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths\{0}", System.IO.Path.GetFileName(Application.ExecutablePath)), Microsoft.Win32.RegistryKeyPermissionCheck.ReadWriteSubTree); appPaths.SetValue(string.Empty, Application.ExecutablePath); object path = appPaths.GetValue("Path"); if(path == null) appPaths.SetValue("Path", System.IO.Path.GetDirectoryName(pluginItem.FileName)); else { string strPath = string.Format("{0};{1}", path, System.IO.Path.GetDirectoryName(pluginItem.FileName)); appPaths.SetValue("Path", strPath); } appPaths.Flush();
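    Since the dependencies here are managed assemblies loaded by the CLR, the native-loader knobs (SetDllDirectory, App Paths, the current directory) never enter the picture: the CLR probes the application base, not the folder the plugin came from. One avenue worth trying is resolving the plugin's dependencies yourself via the AppDomain.AssemblyResolve event. A minimal sketch, assuming you record each plugin's folder (the pluginFolder variable below is illustrative):

    // Sketch: resolve a plugin's managed dependencies from its own folder.
    AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
    {
        // args.Name is the full assembly name; only the simple name is needed here.
        string name = new System.Reflection.AssemblyName(args.Name).Name + ".dll";
        string candidate = System.IO.Path.Combine(pluginFolder, name);
        return System.IO.File.Exists(candidate)
            ? System.Reflection.Assembly.LoadFrom(candidate)
            : null; // fall back to normal probing
    };

    Hook the event once, before any plugin call that might touch a dependency; the handler only fires when normal probing has already failed, so assemblies that resolve normally are unaffected.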

    Read the article

  • WIX installer with Custom Actions: "built by a runtime newer than the currently loaded runtime and cannot be loaded."

    - by Rimer
    I have a WIX installer that executes Custom Actions over the course of the install. When I run the WIX installer, and it encounters its first Custom Action, the installer fails out, and I receive an error in the MSI log as follows: Action start 12:03:53: LoadBCAConfigDefaults. SFXCA: Extracting custom action to temporary directory: C:\DOCUME~1\ELOY06~1\LOCALS~1\Temp\MSI10C.tmp-\ SFXCA: Binding to CLR version v2.0.50727 Calling custom action WIXCustomActions!WIXCustomActions.CustomActions.LoadBCAConfigDefaults Error: could not load custom action class WIXCustomActions.CustomActions from assembly: WIXCustomActions System.BadImageFormatException: Could not load file or assembly 'WIXCustomActions' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded. File name: 'WIXCustomActions' at System.Reflection.Assembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, Assembly locationHint, StackCrawlMark& stackMark, Boolean throwOnFileNotFound, Boolean forIntrospection) at System.Reflection.Assembly.nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, Assembly locationHint, StackCrawlMark& stackMark, Boolean throwOnFileNotFound, Boolean forIntrospection) at System.Reflection.Assembly.InternalLoad(AssemblyName assemblyRef, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection) at System.Reflection.Assembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection) at System.AppDomain.Load(String assemblyString) at Microsoft.Deployment.WindowsInstaller.CustomActionProxy.GetCustomActionMethod(Session session, String assemblyName, String className, String methodName) ... the specific problem from above is "System.BadImageFormatException: Could not load file or assembly 'WIXCustomActions' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded." The wording of that error seems to indicate an incorrectly referenced .NET Framework version (I'm targeting 3.5 in both my custom actions and their dependencies), but I can't figure out where to make a change to address this problem. Any ideas? .... 
    Not sure if this will help, but here's the CustomActions package batch file I run to create the .dll package containing the custom action functions: =============== call "C:\Program Files\Microsoft Visual Studio 10.0\VC\vcvarsall.bat" @echo on cd "C:\development\trunk\PortalsDev\csharp\production\Installers\WIX\customactions\PAServicesWIXCustomActions" csc /target:library /r:"C:\program files\windows installer xml v3.6\sdk\microsoft.deployment.windowsinstaller.dll" /r:"C:\development\trunk\PortalsDev\csharp\production\Installers\WIX\customactions\PAServicesWIXCustomActions\bin\Debug\eLoyalty.PortalLib.dll" /out:"C:\development\trunk\PortalsDev\csharp\production\Installers\WIX\customactions\PAServicesWIXCustomActions\bin\Debug\WIXCustomActions.dll" CustomActions.cs cd "C:\Program Files\Windows Installer XML v3.6\SDK" makesfxca "C:\development\trunk\PortalsDev\csharp\production\Installers\WIX\customactions\PAServicesWIXCustomActions\bin\Debug\BatchCustomerAnalysisWIXCustomActionsPackage.dll" "c:\program files\windows installer xml v3.6\sdk\x86\sfxca.dll" "C:\development\trunk\PortalsDev\csharp\production\Installers\WIX\customactions\PAServicesWIXCustomActions\bin\Debug\WIXCustomActions.dll" customaction.config Microsoft.Deployment.WindowsInstaller.dll
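    A detail worth checking in that batch file: vcvarsall.bat from Visual Studio 10.0 puts the .NET 4.0 csc.exe on the path, so CustomActions.cs is compiled against the v4 runtime, while the log shows the custom action host binding the older CLR ("SFXCA: Binding to CLR version v2.0.50727"), which is exactly what the BadImageFormatException text describes. Two hedged fixes: compile with the 3.5-era compiler instead (e.g. %WINDIR%\Microsoft.NET\Framework\v3.5\csc.exe), or let the customaction.config that makesfxca already embeds admit the newer runtime. A sketch of such a config (verify the exact policy against your WiX/DTF version):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <startup useLegacyV2RuntimeActivationPolicy="true">
        <!-- Prefer the v4 CLR for the custom action host; keep v2 as a fallback -->
        <supportedRuntime version="v4.0" />
        <supportedRuntime version="v2.0.50727" />
      </startup>
    </configuration>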

    Read the article

  • Android Mediaplayer: setDataSource issue for downloaded media file

    - by Erik
    I have an application that will record and play audio files. Some of the audio files are downloaded with simple, standard HTTP downloads using HttpClient. It worked like a charm for a long time. Now all of a sudden I cannot play the files I download. It fails with this stack. I store the files on the SDCard and I experience the problem both on a handset and a USB connected device. I have checked that the downloaded file is fine on the server, and I can play it without any issues. These are the code snippets I use (I know that recordingFile is a valid path for the file). // inside the activity class private void playRecording() throws IOException{ File recordingFile = new File(recordingFileName); FileInputStream recordingInputStream = new FileInputStream(recordingFile); audioMediaPlayer.playAudio(recordingInputStream); } Here is the media player code: // inside my media player class which handles the recordings public void playAudio(FileInputStream audioInputStream) throws IOException { mediaPlayer.reset(); mediaPlayer.setDataSource(audioInputStream.getFD()); mediaPlayer.prepare(); mediaPlayer.start(); } Here is the exception: E/MediaPlayerService( 555): offset error E/MediaPlayer( 786): Unable to to create media player W/System.err( 786): java.io.IOException: setDataSourceFD failed.: status=0x80000000 W/System.err( 786): at android.media.MediaPlayer.setDataSource(Native Method) W/System.err( 786): at android.media.MediaPlayer.setDataSource(MediaPlayer.java:632) W/System.err( 786): at net.xxx.xxx.AudioMediaPlayer.playAudio(AudioMediaPlayer.java:69) W/System.err( 786): at net.xxx.xxx.Downloads.playRecording(Downloads.java:299) W/System.err( 786): at net.xxx.xxx.Downloads.access$0(Downloads.java:294) W/System.err( 786): at net.xxx.xxx.Downloads$1.onClick(Downloads.java:135) I have tried searching for an answer to the offset error, but it's not really clear what the issue might be. PS I download the file with this code: public FileOutputStream executeHttpGet(FileOutputStream fileOutputStream) throws ClientProtocolException, IOException{ try { // Execute HTTP Post Request httpResponse = httpClient.execute(httpPost, localContext); int status = httpResponse.getStatusLine().getStatusCode(); // we assume that the response body contains the error message if (status != HttpStatus.SC_OK) { ByteArrayOutputStream ostream = new ByteArrayOutputStream(); httpResponse.getEntity().writeTo(ostream); fileOutputStream = null; } else { InputStream content = httpResponse.getEntity().getContent(); byte[] buffer = new byte[1024]; int len = 0; while ( (len = content.read(buffer)) > 0 ) { fileOutputStream.write(buffer,0, len); } fileOutputStream.close(); content.close(); // this will also close the connection } } catch (ClientProtocolException e1) { // TODO Auto-generated catch block e1.printStackTrace(); fileOutputStream = null; } catch (IOException e2) { // TODO Auto-generated catch block e2.printStackTrace(); fileOutputStream = null; } return fileOutputStream; }
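    The "offset error" logged by MediaPlayerService suggests the service is unhappy with where the descriptor points or how much data it may read. Two things worth trying before digging deeper (a hedged sketch, not a confirmed fix): pass an explicit offset and length along with the descriptor, or skip the descriptor and hand the player the file path directly:

    // Sketch: explicit offset/length for the descriptor
    FileInputStream fis = new FileInputStream(recordingFile);
    mediaPlayer.reset();
    mediaPlayer.setDataSource(fis.getFD(), 0, recordingFile.length());
    mediaPlayer.prepare();
    mediaPlayer.start();

    // Or: let MediaPlayer open the file itself
    mediaPlayer.reset();
    mediaPlayer.setDataSource(recordingFile.getAbsolutePath());
    mediaPlayer.prepare();
    mediaPlayer.start();

    Both setDataSource overloads are part of android.media.MediaPlayer; if the path variant plays the same file the FileDescriptor route rejects, the descriptor handling, rather than the downloaded bytes, is the likely culprit.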

    Read the article

  • Secondary monitor bug: a problem in WPF or in the graphics driver?

    - by emddudley
    I have discovered a strange bug with my WPF application and I am trying to determine whether it is a problem with WPF or my graphics driver so that I can report it to the appropriate company. I have a Quadro FX 1700 with the latest drivers (197.54) on a Windows XP system, running a .NET 3.5 SP1 application. I have dual monitors, and when I maximize then minimize a child window of the main window on my primary monitor, the child window gets drawn on the secondary monitor as well. It appears in both places. I made a sample application (code is below) which induces this behavior. Start the application and ensure the main window is on your primary monitor. Double-click the main window. A green child window should appear. Click the green child window to maximize. Click the green child window to minimize. Can anyone else reproduce this problem? On my system the green child restores, but then it's drawn on both my primary and secondary monitors, rather than just the primary monitor. App.xaml <Application xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" x:Class="DualMonitorBug.App" StartupUri="Shell.xaml" /> App.xaml.cs using System.Windows; namespace DualMonitorBug { public partial class App : Application { } } Shell.xaml <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" x:Class="DualMonitorBug.Shell" Title="Shell" Height="480" Width="640" MouseDoubleClick="ShowDialog" /> Shell.xaml.cs using System.Windows; using System.Windows.Input; namespace DualMonitorBug { public partial class Shell : Window { public Shell() { InitializeComponent(); } private void ShowDialog(object sender, MouseButtonEventArgs e) { DialogWindow dialog = new DialogWindow(); dialog.Owner = this; dialog.Show(); } } } DialogWindow.xaml <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" x:Class="DualMonitorBug.DialogWindow" Title="Dialog Window" Height="240" Width="320" AllowsTransparency="True" Background="Green" MouseLeftButtonDown="ShowHideDialog" WindowStyle="None" /> DialogWindow.xaml.cs using System.Windows; using System.Windows.Input; namespace DualMonitorBug { public partial class DialogWindow : Window { public DialogWindow() { InitializeComponent(); } private void ShowHideDialog(object sender, MouseButtonEventArgs e) { this.WindowState = (this.WindowState == WindowState.Normal) ? WindowState.Maximized : WindowState.Normal; } } }

    Read the article

  • Resgen/al.exe-generated resources do not work within .NET library

    - by Raj G
    Hi, I am currently working on a library in .NET and I planned to make the strings that are used within the library into culture-specific resource files. I made Resources.resx, Resources.en-US.resx and Resources.ja-JP.resx files. I also deleted the Resources.designer.cs file autogenerated by Visual Studio 2008. I am loading resources through my custom ResourceManager object [using the GetString method]. The problem that I am facing is that when I compile the library within Visual Studio and set the culture from the calling application, everything works fine. But if I manually go to the directory, change a string for a culture, and regenerate the satellite assembly with resgen and al.exe, the string displayed falls back to the invariant culture. I have attached the ildasm view of both DLLs. en-US generated from within Visual Studio: //Metadata version: v2.0.50727 .assembly extern mscorlib { .publickeytoken = (B7 7A 5C 56 19 34 E0 89 ) // .z\V.4.. .hash = (71 05 4D 54 C4 8D C2 90 7D 8B CF 57 2E B5 98 22 // q.MT....}..W..." F5 5B 2E 06 ) // .[.. .ver 2:0:0:0 } .assembly EmailEngine.resources { .custom instance void [mscorlib]System.Reflection.AssemblyTitleAttribute::.ctor(string) = ( 01 00 0B 45 6D 61 69 6C 45 6E 67 69 6E 65 00 00 ) // ...EmailEngine.. .custom instance void [mscorlib]System.Reflection.AssemblyDescriptionAttribute::.ctor(string) = ( 01 00 FF 00 00 ) .custom instance void [mscorlib]System.Reflection.AssemblyCompanyAttribute::.ctor(string) = ( 01 00 FF 00 00 ) .custom instance void [mscorlib]System.Reflection.AssemblyProductAttribute::.ctor(string) = ( 01 00 0B 45 6D 61 69 6C 45 6E 67 69 6E 65 00 00 ) // ...EmailEngine.. .custom instance void [mscorlib]System.Reflection.AssemblyCopyrightAttribute::.ctor(string) = ( 01 00 12 43 6F 70 79 72 69 67 68 74 20 C2 A9 20 // ...Copyright .. 20 32 30 30 38 00 00 ) // 2008.. .custom instance void [mscorlib]System.Reflection.AssemblyTrademarkAttribute::.ctor(string) = ( 01 00 FF 00 00 ) .custom instance void [mscorlib]System.Reflection.AssemblyFileVersionAttribute::.ctor(string) = ( 01 00 07 31 2E 30 2E 30 2E 30 00 00 ) // ...1.0.0.0.. .hash algorithm 0x00008004 .ver 1:0:0:0 .locale = (65 00 6E 00 2D 00 55 00 53 00 00 00 ) // e.n.-.U.S... } .mresource public 'EmailEngine.Properties.Resources.en-US.resources' { // Offset: 0x00000000 Length: 0x00000111 } .module EmailEngine.resources.dll // MVID: {D030D620-4E59-46F4-94F4-5EA0F9554E67} .imagebase 0x00400000 .file alignment 0x00000200 .stackreserve 0x00100000 .subsystem 0x0003 // WINDOWS_CUI .corflags 0x00000001 // ILONLY // Image base: 0x008B0000 ja-JP generated by me using resgen and al.exe: // Metadata version: v2.0.50727 .assembly EmailEngine.resources { .hash algorithm 0x00008004 .ver 0:0:0:0 .locale = (6A 00 61 00 00 00 ) // j.a... } .mresource public 'EmailEngine.Properties.Resources.ja-JP.resources' { // Offset: 0x00000000 Length: 0x0000012F } .module EmailEngine.resources.dll // MVID: {0F470BCD-C36D-4B9F-A8ED-205A0E5A9F6F} .imagebase 0x00400000 .file alignment 0x00000200 .stackreserve 0x00100000 .subsystem 0x0003 // WINDOWS_CUI .corflags 0x00000001 // ILONLY // Image base: 0x007F0000 Can anyone help me as to why these two files are different and what is going on here? Why would the same Japanese resource file work when generated from within Visual Studio and not when generated using the command-line tools? TIA Raj
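    Two differences in those dumps stand out, and both matter to satellite probing: the hand-built assembly has .ver 0:0:0:0 (the VS-built one carries the main assembly's 1.0.0.0), and its .locale reads "ja" rather than "ja-JP", while the embedded resource is named ...ja-JP.resources. The runtime only accepts a satellite whose culture and version match what the ResourceManager asks for. A hedged sketch of an al.exe invocation that copies version and attributes from the main assembly via /template (paths are illustrative):

    resgen Resources.ja-JP.resx EmailEngine.Properties.Resources.ja-JP.resources
    al.exe /target:lib /culture:ja-JP /template:EmailEngine.dll /embed:EmailEngine.Properties.Resources.ja-JP.resources /out:ja-JP\EmailEngine.resources.dll

    If the main assembly is strong-named, /template also propagates the key requirement, in which case the satellite must be signed with the same key.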

    Read the article

  • SSIS package randomly hangs during execution

    - by Adam MacLeod
    Hi Guys, I am having an ongoing and painful problem with an SSIS package. The package runs every 5 minutes as an SQL Agent Job, and every 2-10 days the package will start running and never stop (thus preventing further executions). If I stop the hung job manually it will begin working perfectly again in the next 5-minute interval. The SSIS package is for moving data from an Oracle database to a MSSQL 2005 database. It has 7 steps: Step 1 calls an Oracle stored procedure to prepare the temporary tables inside ORACLE. Steps 2-6 process the data from the ORACLE tables to the MSSQL tables (ORACLE -> MSSQL). Step 7 calls an Oracle stored procedure to clear the ORACLE temporary tables. I suspect that the issue is caused by a communications error between the MSSQL server and the ORACLE server. Both the MSSQL database and the Agent/package run on one machine, with the ORACLE database running over the network. I have enabled logging of the SSIS package, and after more than 2GB of log file I have captured the instant where the package stops responding: OnPreValidate,ADV-SRV5,NT AUTHORITY\SYSTEM,CallistaIntegrationToMonashCRM_delta,{F88F6C45-CFA2-4801-A2F2-DDF03D458A48},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,(null) OnPreValidate,ADV-SRV5,NT AUTHORITY\SYSTEM,Address,{c5907799-f918-43da-818a-d4bd7f188367},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,(null) OnInformation,ADV-SRV5,NT AUTHORITY\SYSTEM,Address,{c5907799-f918-43da-818a-d4bd7f188367},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,1074016266,0x,Validation phase is beginning. OnProgress,ADV-SRV5,NT AUTHORITY\SYSTEM,Address,{c5907799-f918-43da-818a-d4bd7f188367},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,Validating Diagnostic,ADV-SRV5,NT AUTHORITY\SYSTEM,Callista,{cb5d6fe3-3ea4-4453-8e5a-965818021df7},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,ExternalRequest_pre: The object is ready to make the following external request: 'IDataInitialize::GetDataSource'. Diagnostic,ADV-SRV5,NT AUTHORITY\SYSTEM,Callista,{cb5d6fe3-3ea4-4453-8e5a-965818021df7},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,ExternalRequest_post: 'IDataInitialize::GetDataSource succeeded'. The external request has completed. Diagnostic,ADV-SRV5,NT AUTHORITY\SYSTEM,Callista,{cb5d6fe3-3ea4-4453-8e5a-965818021df7},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,ExternalRequest_pre: The object is ready to make the following external request: 'IDBInitialize::Initialize'. These messages show the entire log generated for the failed run; for a successful run the output is typically ~2500 lines. I can see that the package is hanging during the initialize operation on the Callista connection (the ORACLE database). I have not been able to work out a way to either fix this issue or have the package die gracefully (an error to the log would be A-OK with me). Any help or advice would be greatly appreciated.

    Read the article

  • "Bad binary signature" in ASP.NET MVC application

    - by David M
    We are getting the error above on some pages of an ASP.NET MVC application when it is deployed to a 64-bit Windows 2008 server box. It works fine on our development machines, though these are 32-bit XP. Just wondered if anyone had encountered this before, and has any suggestions? Details as follows: Bad binary signature. (Exception from HRESULT: 0x80131192) Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Runtime.InteropServices.COMException: Bad binary signature. (Exception from HRESULT: 0x80131192) All projects are set to compile for Any CPU, and are compiled in Release mode. The ASP.NET site is precompiled, and the precompiled build is produced on a 64-bit Windows 2008 TeamCity build agent. Thanks in advance. EDIT We're still plagued by this. I have looked at all the binaries in the website's bin directory using corflags.exe. None has the 32BIT flag set, and all have a CorFlags value of 9 except for Antlr3.Runtime.dll, which has a value of 1. The problem only affects certain pages, and it seems to be those which use FluentValidation (including the FluentValidation.Mvc and FluentValidation.xValIntegration assemblies). None of these shows anything out of the ordinary when inspected with corflags.exe, and there are no odd-looking dependencies revealed by ildasm. When built locally (32-bit Windows XP) the site deploys and runs fine. When built on the build agents (64-bit Windows 2008 Server) the site displays these errors. The site runs in Integrated Pipeline mode, and is not set to 32-bit. The stack trace is: [COMException (0x80131192): Bad binary signature. (Exception from HRESULT: 0x80131192)] ASP.views_user_newinternal_aspx.__RenderContent2(HtmlTextWriter __w, Control parameterContainer) in e:\TeamCity\buildAgent\work\605ee6b4a5d1dd36\...Admin.Mvc\Views\User\NewInternal.aspx:53 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +115 ASP.views_shared_site_master.__Render__control1(HtmlTextWriter __w, Control parameterContainer) in e:\TeamCity\buildAgent\work\605ee6b4a5d1dd36\...Admin.Mvc\Views\Shared\Site.Master:26 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +115 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +240 System.Web.UI.Page.Render(HtmlTextWriter writer) +38 System.Web.Mvc.ViewPage.Render(HtmlTextWriter writer) +94 System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +4240

    Read the article

  • channel factory null on debug?

    - by Garrith
    When I try to invoke the GetData operation of a WCF REST service from the WCF Test Client, I get this message: The Address property on ChannelFactory.Endpoint was null. The ChannelFactory's Endpoint must have a valid Address specified. at System.ServiceModel.ChannelFactory.CreateEndpointAddress(ServiceEndpoint endpoint) at System.ServiceModel.ChannelFactory`1.CreateChannel() at System.ServiceModel.ClientBase`1.CreateChannel() at System.ServiceModel.ClientBase`1.CreateChannelInternal() at System.ServiceModel.ClientBase`1.get_Channel() at Service1Client.GetData(String value) This is the config file for the host: <system.serviceModel> <services> <service name="WcfService1.Service1" behaviorConfiguration="WcfService1.Service1Behavior"> <!-- Service Endpoints --> <endpoint address="http://localhost:26535/Service1.svc" binding="webHttpBinding" contract="WcfService1.IService1" behaviorConfiguration="webHttp" > <!-- Upon deployment, the following identity element should be removed or replaced to reflect the identity under which the deployed service runs. If removed, WCF will infer an appropriate identity automatically. --> <identity> <dns value="localhost"/> </identity> </endpoint> <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange"/> </service> </services> <behaviors> <serviceBehaviors> <behavior name="WcfService1.Service1Behavior"> <!-- To avoid disclosing metadata information, set the value below to false and remove the metadata endpoint above before deployment --> <serviceMetadata httpGetEnabled="true"/> <!-- To receive exception details in faults for debugging purposes, set the value below to true. Set to false before deployment to avoid disclosing exception information --> <serviceDebug includeExceptionDetailInFaults="false"/> </behavior> </serviceBehaviors> <endpointBehaviors> <behavior name="webHttp"> <webHttp/> </behavior> </endpointBehaviors> </behaviors> </system.serviceModel> </configuration> Code: [ServiceContract(Namespace = "")] public interface IService1 { //[WebInvoke(Method = "POST", UriTemplate = "Data?value={value}")] [OperationContract] [WebGet(UriTemplate = "/{value}")] string GetData(string value); [OperationContract] CompositeType GetDataUsingDataContract(CompositeType composite); // TODO: Add your service operations here } public class Service1 : IService1 { public string GetData(string value) { return string.Format("You entered: {0}", value); }
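    The trace shows the generated proxy building a ChannelFactory whose endpoint has no address: the WCF Test Client doesn't understand webHttpBinding endpoints, so it can't fill one in. For exercising a REST endpoint it may be simpler to browse to the UriTemplate directly, or to build the channel yourself with WebChannelFactory and an explicit address. A minimal sketch, assuming the address from the config above:

    // Sketch: supply the address explicitly so Endpoint.Address is never null
    var factory = new System.ServiceModel.Web.WebChannelFactory<IService1>(
        new Uri("http://localhost:26535/Service1.svc"));
    IService1 proxy = factory.CreateChannel();
    string result = proxy.GetData("42"); // dispatched via the [WebGet] UriTemplate

    WebChannelFactory lives in System.ServiceModel.Web.dll, so the test project needs a reference to that assembly.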

    Read the article

  • .Net Custom Configuration Section and Saving Changes within PropertyGrid

    - by Paul
    If I load the My.Settings object (app.config) into a PropertyGrid, I am able to edit the properties inside the PropertyGrid and the changes are automatically saved. PropertyGrid1.SelectedObject = My.Settings I want to do the same with a Custom Configuration Section. In the following code example (from http://www.codeproject.com/KB/vb/SerializePropertyGrid.aspx), the author does explicit serialization to disk when a "Save" button is pushed. Public Class Form1 'Load AppSettings Dim _appSettings As New AppSettings() Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click _appSettings = AppSettings.Load() ' Actually change the form size Me.Size = _appSettings.WindowSize PropertyGrid1.SelectedObject = _appSettings End Sub Private Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click _appSettings.Save() End Sub End Class In my code, my custom section Inherits from ConfigurationSection (see below). Question: Is there something built into the ConfigurationSection class that does the autosave? If not, what is the best way to handle this; should it be in the PropertyGrid.PropertyValueChanged event? (How does My.Settings handle this internally?) Here is the example Custom Class that I am trying to get to auto-save, and how I load it into the PropertyGrid. Dim config As System.Configuration.Configuration = _ ConfigurationManager.OpenExeConfiguration( _ ConfigurationUserLevel.None) PropertyGrid2.SelectedObject = config.GetSection("CustomSection") Public NotInheritable Class CustomSection Inherits ConfigurationSection ' The collection (property bag) that contains ' the section properties. Private Shared _Properties As ConfigurationPropertyCollection ' The FileName property. Private Shared _FileName As New ConfigurationProperty("fileName", GetType(String), "def.txt", ConfigurationPropertyOptions.IsRequired) ' The MasUsers property. Private Shared _MaxUsers _ As New ConfigurationProperty("maxUsers", _ GetType(Int32), 1000, _ ConfigurationPropertyOptions.None) ' The MaxIdleTime property. Private Shared _MaxIdleTime _ As New ConfigurationProperty("maxIdleTime", _ GetType(TimeSpan), TimeSpan.FromMinutes(5), _ ConfigurationPropertyOptions.IsRequired) ' CustomSection constructor. Public Sub New() _Properties = New ConfigurationPropertyCollection() _Properties.Add(_FileName) _Properties.Add(_MaxUsers) _Properties.Add(_MaxIdleTime) End Sub 'New ' This is a key customization. ' It returns the initialized property bag. Protected Overrides ReadOnly Property Properties() _ As ConfigurationPropertyCollection Get Return _Properties End Get End Property <StringValidator( _ InvalidCharacters:=" ~!@#$%^&*()[]{}/;'""|\", _ MinLength:=1, MaxLength:=60)> _ <EditorAttribute(GetType(System.Windows.Forms.Design.FileNameEditor), GetType(System.Drawing.Design.UITypeEditor))> _ Public Property FileName() As String Get Return CStr(Me("fileName")) End Get Set(ByVal value As String) Me("fileName") = value End Set End Property <LongValidator(MinValue:=1, _ MaxValue:=1000000, ExcludeRange:=False)> _ Public Property MaxUsers() As Int32 Get Return Fix(Me("maxUsers")) End Get Set(ByVal value As Int32) Me("maxUsers") = value End Set End Property <TimeSpanValidator(MinValueString:="0:0:30", _ MaxValueString:="5:00:0", ExcludeRange:=False)> _ Public Property MaxIdleTime() As TimeSpan Get Return CType(Me("maxIdleTime"), TimeSpan) End Get Set(ByVal value As TimeSpan) Me("maxIdleTime") = value End Set End Property End Class 'CustomSection
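    As far as I know there is nothing in ConfigurationSection itself that auto-saves; My.Settings only feels automatic because the VB application framework saves it on exit (My.Application.SaveMySettingsOnExit). The PropertyValueChanged route mentioned above should work: persist through the Configuration object the section was read from. A hedged VB sketch (assumes config is kept in a field after OpenExeConfiguration):

    ' Sketch: save the edited section whenever a grid value changes
    Private Sub PropertyGrid2_PropertyValueChanged(ByVal s As Object, _
            ByVal e As PropertyValueChangedEventArgs) _
            Handles PropertyGrid2.PropertyValueChanged
        config.Save(ConfigurationSaveMode.Modified)
        ConfigurationManager.RefreshSection("CustomSection")
    End Sub

    RefreshSection makes sure the next ConfigurationManager.GetSection call reads the freshly written values instead of the cached ones.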

    Read the article

  • Apache HttpClient Digest authentication

    - by Milan Jovic
    Hi, Basically what I need to do is to perform digest authentication. The first thing I tried is the official example available here. But when I try to execute it (with some small changes, Post instead of the Get method) I get a org.apache.http.auth.MalformedChallengeException: missing nonce in challange at org.apache.http.impl.auth.DigestScheme.processChallenge(DigestScheme.java:132) When this failed I tried using: DefaultHttpClient client = new DefaultHttpClient(); client.getCredentialsProvider().setCredentials(new AuthScope(null, -1, null), new UsernamePasswordCredentials("<username>", "<password>")); HttpPost post = new HttpPost(URI.create("http://<someaddress>")); List<NameValuePair> nvps = new ArrayList<NameValuePair>(); nvps.add(new BasicNameValuePair("domain", "<username>")); post.setEntity(new UrlEncodedFormEntity(nvps, HTTP.UTF_8)); DigestScheme digestAuth = new DigestScheme(); digestAuth.overrideParamter("algorithm", "MD5"); digestAuth.overrideParamter("realm", "http://<someaddress>"); digestAuth.overrideParamter("nonce", Long.toString(new Random().nextLong(), 36)); digestAuth.overrideParamter("qop", "auth"); digestAuth.overrideParamter("nc", "0"); digestAuth.overrideParamter("cnonce", DigestScheme.createCnonce()); Header auth = digestAuth.authenticate(new UsernamePasswordCredentials("<username>", "<password>"), post); System.out.println(auth.getName()); System.out.println(auth.getValue()); post.setHeader(auth); HttpResponse ret = client.execute(post); ByteArrayOutputStream v2 = new ByteArrayOutputStream(); ret.getEntity().writeTo(v2); System.out.println("----------------------------------------"); System.out.println(v2.toString()); System.out.println("----------------------------------------"); System.out.println(ret.getStatusLine().getReasonPhrase()); System.out.println(ret.getStatusLine().getStatusCode()); At first I had only overridden the "realm" and "nonce" DigestScheme parameters. But it turned out that the PHP script running on the server requires all the other params; yet whether I specify them or not, DigestScheme doesn't generate them when I call its authenticate() method. I've been struggling with this for two days, and no luck. Based on everything, I think that the cause of the problem is the PHP script. It looks to me like it doesn't send a proper challenge when the app tries to access it unauthorized. Any ideas anyone?
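    One hedged suggestion before hand-rolling the DigestScheme: register the credentials and let HttpClient drive the handshake reactively. The first request draws the 401 challenge (which carries the server's nonce, realm and qop), and HttpClient retries with a correct Authorization header on its own. If even that path throws MalformedChallengeException, the WWW-Authenticate header the PHP script sends really is missing its nonce, which would confirm the server-side suspicion. A minimal sketch:

    // Sketch: reactive digest auth; HttpClient answers the challenge itself
    DefaultHttpClient client = new DefaultHttpClient();
    client.getCredentialsProvider().setCredentials(
            new AuthScope(AuthScope.ANY_HOST, AuthScope.ANY_PORT),
            new UsernamePasswordCredentials("<username>", "<password>"));
    HttpPost post = new HttpPost("http://<someaddress>");
    HttpResponse response = client.execute(post);

    Dumping response.getHeaders("WWW-Authenticate") on the first 401 will show exactly what the challenge contains.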

    Read the article

  • ASP.NET Web Service Throws 401 (unauthorized) Error

    - by user268611
    Hi Experts, I have this .NET application to be run in an intranet environment. It is configured so that it requires Windows Authentication before you can access the website (anonymous access is disabled). This website calls a web service (anonymous access enabled) and the web service calls the DB. We do have token-based authentication between the web application and the web service to secure the communication between them. The issue I'm facing is that when I deploy this to production, I'm having an intermittent issue whereby the communication between the web application and the web service fails. A 401 error is thrown. This is actually working fine in our QA environment. Is this an issue with Active Directory? Or could it be an issue with FQDN as mentioned here: http://support.microsoft.com/default.aspx?scid=kb;en-us;896861? The weirdest thing is that this is happening intermittently when tested both on the server itself and on a remote workstation in my client's environment. But, this is working perfectly in my environment. OS: Windows Server SP1 IIS 6 .NET 3.5 Framework Any idea about the 401 (Unauthorized) issue?? Thx for the help... This is from the log... Event code: 3005 Event message: An unhandled exception has occurred. Event time: 4/5/2010 10:44:57 AM Event time (UTC): 4/5/2010 2:44:57 AM Event ID: 6c8ea2607b8d4e29a7f0b1c392b1cb21 Event sequence: 155112 Event occurrence: 2 Event detail code: 0 Application information: Application domain: xxx Trust level: Full Application Virtual Path: xxx Application Path: xxx Machine name: xxx Process information: Process ID: 4424 Process name: w3wp.exe Account name: NT AUTHORITY\NETWORK SERVICE Exception information: Exception type: WebException Exception message: The request failed with HTTP status 401: Unauthorized. Request information: Request URL: http://[ip]/[app_path] Request path: xxx User host address: [ip] User: xxx Is authenticated: True Authentication Type: Negotiate Thread account name: xxx Thread information: Thread ID: 6 Thread account name: xxx Is impersonating: False Stack trace: at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall) at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters) at wsVulnerabilityAdvisory.Service.test() at test.Page_Load(Object sender, EventArgs e) at System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e) at System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e) at System.Web.UI.Control.OnLoad(EventArgs e) at System.Web.UI.Control.LoadRecursive() at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

    Read the article

  • XmlReader throws on an RSS feed when the RSS document includes an embedded <script> block.

    - by Cheeso
    The code: using (XmlReader xmlr = XmlReader.Create(new StringReader(allXml))) { var items = from item in SyndicationFeed.Load(xmlr).Items select item; } The exception: Exception: System.Xml.XmlException: Unexpected node type Element. ReadElementString method can only be called on elements with simple or empty content. Line 11, position 25. at System.Xml.XmlReader.ReadElementString() at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadXml(XmlReader reader, SyndicationFeed result) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFeed(XmlReader reader) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFrom(XmlReader reader) at System.ServiceModel.Syndication.SyndicationFeed.Load[TSyndicationFeed](XmlReader reader) at System.ServiceModel.Syndication.SyndicationFeed.Load(XmlReader reader) at Ionic.ToolsAndTests.ReadRss.Run() in c:\dev\dotnet\ReadRss.cs:line 90 The XML content: <?xml version="1.0" encoding="utf-8"?> <?xml-stylesheet type="text/xsl" href="https://www.ibm.com/developerworks/mydeveloperworks/blogs/roller-ui/styles/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom" > <channel> <title>Software architecture, software engineering, and Renaissance Jazz</title> <link>https://www.ibm.com/developerworks/mydeveloperworks/blogs/gradybooch</link> <atom:link rel="self" type="application/rss+xml" href="https://www.ibm.com/developerworks/mydeveloperworks/blogs/gradybooch/feed/entries/rss?lang=en" /> <description>Software architecture, software engineering, and Renaissance Jazz</description> <language>en-us</language> <copyright>Copyright <script type='text/javascript'> document.write(blogsDate.date.localize (1273534889181));</script></copyright> <lastBuildDate>Mon, 10 May 2010 19:41:29 -0400</lastBuildDate> As you can see, on line 11, at position 25, there's a script block inside the <copyright> element. Other people have reported similar errors with other XML documents. The way I worked around this was to do a StreamReader.ReadToEnd, then do Regex.Replace on the result of that to yank out the script block, before passing the modified string to XmlReader.Create(). Feels like a hack. Has anyone got a better approach? I don't like this because I have to read a 125k string into memory. Is it valid RSS to include "complex content" like that - a script block inside an element? Thanks.
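    If the regex bothers you, one alternative under the same constraint (the feed really is malformed for Rss20FeedFormatter's expectations, since <copyright> must be simple content) is to load the document with LINQ to XML, strip the script elements, and hand the cleaned tree to the reader. It still buffers the document, but it avoids fragile string surgery. A sketch (requires using System.Linq and System.Xml.Linq):

    // Sketch: strip <script> elements before handing the feed to SyndicationFeed
    var doc = System.Xml.Linq.XDocument.Parse(allXml);
    doc.Descendants().Where(e => e.Name.LocalName == "script").Remove();
    using (System.Xml.XmlReader reader = doc.CreateReader())
    {
        var feed = System.ServiceModel.Syndication.SyndicationFeed.Load(reader);
        var items = feed.Items;
    }

    Matching on LocalName keeps the filter working whether or not the script element carries a namespace.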

    Read the article

  • ASP.Net and FaceBook Wall integration

    - by Jit
    Hi, I am new to the FaceBook API. I'd like to send messages whenever I want from my aspx webpage. I followed the different examples and used FaceBook API 3.1. I am getting the error "service temporarily unavailable". I have assigned the AppKey, Secret Key and session key. My sample code is below. using System; using System.Collections.Generic; using System.Linq; using System.Web; using System.Web.UI; using System.Web.UI.WebControls; using Facebook; using Facebook.Web; using Facebook.Rest; using Facebook.Session; using Facebook.Utility; using Facebook.BindingHelper; using Facebook.Schema; using Facebook.Web.FbmlControls; using Facebook.Session.DesktopPopup; namespace FaceBookApp { public partial class _Default : System.Web.UI.Page { private const string ApplicationKey = "MY_APP_KEY"; private const string SecretKey = "MY_SECRET_KEY"; private Api fbAPI; private ConnectSession _connectSession; // private List<EventUser> _eventUsers; protected void Page_Load(object sender, EventArgs e) { } protected void Button1_Click(object sender, EventArgs e) { _connectSession = new ConnectSession(ApplicationKey, SecretKey); _connectSession.SessionKey = "MY_SESSION_KEY"; if (!_connectSession.IsConnected()) { // Not authenticated, proceed as usual. //lblStatus.Text = "Please sign-in with Facebook."; } else { // Authenticated, create API instance fbAPI = new Api(_connectSession); string response = fbAPI.Stream.Publish("publish steven on facebook."); } } } } The error message is confusing. It must be a very simple solution, but I don't know what I am missing. Pls help. Here is the Facebook exception: {"Service temporarily unavailable"}

    Read the article

  • Autoloading Development or Production configs (best practices)

    - by Xeoncross
    When programming sites you usually have one set of config files for the development environment and another set for the production server (or one file with both settings). I am assuming all projects should be handled by version control like git or svn. Manual file transfers (like FTP) are wrong on so many levels. How you enable/disable the correct settings (so that your system knows which ones to use) is a problem for me. Each system I work on just kind of jimmy-rigs a solution. Below are the 3 methods I know of, and I am hoping that someone can submit a more elegant solution. 1) File Based The system loads a folder structure based on the URL requested. /site.com /site.fakeTLD /lib index.php For example, if the URL is http://site.com then the system loads the production config files located in the site.com folder. However, if I'm working on the site locally I visit http://site.fakeTLD to work on the local copy of the site. To set this up I edit my hosts file and add site.fakeTLD to point to my own computer (127.0.0.1/localhost) and then create a vhost in apache. So now I can work on the codebase locally and then push to the server without any trouble. The problem is that this is susceptible to a "host" injection attack. So someone loading site.com could set the host to site.fakeTLD and then the system would load my development config files instead of production. 2) Config Based The config files contain one section for development - and one for production. The problem is that each time you go to push your changes to the repo you have to edit the file to specify which set of config options should be used. $use = 'production'; //'development'; This leaves the repo open to human error should one of the developers forget to enable the right setting. 3) File System Check Based All the development machines have an extra empty file called "development.txt" or something. Each time the system loads it checks for this file - if found then it knows it is in development mode - if missing then it knows it is in production mode. Since the file is NEVER ADDED to the repo, it will never be pushed (and checked out) on the production machine. However, this just doesn't feel right and causes a slight slowdown since all filesystem checks are slow. Is there any way that the server can auto-detect whether to use the development or production configs?
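    A fourth option that avoids both the spoofable host header and the marker file: read an environment variable set per machine (in the vhost for Apache, e.g. SetEnv APP_ENV development, or in the shell), and default to production so a missing variable can never expose development settings. A PHP sketch (the variable and file names are illustrative):

    // Sketch: choose the config set from an environment variable
    $environment = getenv('APP_ENV');
    if ($environment === false) {
        $environment = 'production'; // safe default
    }
    $config = parse_ini_file("config.{$environment}.ini");

    Since the variable lives in server configuration rather than in the repo, nothing has to be edited or remembered at push time, and a client cannot influence it.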

    Read the article

  • trouble resolving location in <xs:import> element in C#

    - by BobC
    I'm using an XML schema document to validate incoming data documents; however, the schema appears to be failing during compilation at run time because it refers to a complex type which is part of an external schema. The external schema is specified in an <xs:import> element at the top of the document. I had thought it might be an access problem, so I moved a copy of the external document to a localhost folder. I get the same error, so now I'm wondering if there might be some sort of issue with the use of the <xs:import> element. The schema document fragment looks like this: <xs:schema targetNamespace="http://www.smpte-ra.org/schemas/429-7/2006/CPL" xmlns:cpl="http://www.smpte-ra.org/schemas/429-7/2006/CPL" xmlns:ds="http://www.w3.org/2000/09/xmldsig#" xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified" attributeFormDefault="unqualified"> ... <xs:import namespace="http://www.w3.org/2000/09/xmldsig#" schemaLocation="http://localhost/TMSWebServices/XMLSchema/xmldsig-core-schema.xsd"/> ... <xs:element name="Signer" type="ds:KeyInfoType" minOccurs="0"/> ... </xs:schema> The code I'm trying to run this with is real simple (got it from http://dotnetslackers.com/Community/blogs/haissam/archive/2008/11/06/validate-xml-against-xsd-xml-schema-using-c.aspx) string XSDFILEPATH = @"http://localhost/TMSWebServices/XMLSchema/CPL.xsd"; string XMLFILEPATH = @"C:\foo\bar\files\TestCPLs\CPL_930f5e92-be03-440c-a2ff-a13f3f16e1d6.xml"; System.Xml.XmlReaderSettings settings = new System.Xml.XmlReaderSettings(); settings.Schemas.Add(null, XSDFILEPATH); settings.ValidationType = System.Xml.ValidationType.Schema; System.Xml.XmlDocument document = new System.Xml.XmlDocument(); document.Load(XMLFILEPATH); System.Xml.XmlReader rdr = System.Xml.XmlReader.Create(new StringReader(document.InnerXml), settings); while (rdr.Read()) { } Everything goes well until the line that instantiates the XmlReader object just before the while loop. Then it fails with a type not declared error. The type that it's trying to find, KeyInfoType, is defined in one of the documents referenced by the import element. I've made sure the namespaces line up. I wondered if the # signs in the namespace definitions were causing a problem, but removing them had no effect; it just changed what the error looked like (i.e. "Type 'http://www.w3.org/2000/09/xmldsig:KeyInfoType' is not declared." versus "Type 'http://www.w3.org/2000/09/xmldsig#:KeyInfoType' is not declared.") My suspicion is that there's something about the processing of the <xs:import> element that I'm missing. Any suggestions are very welcome. Thanks!
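    Since the schema set only compiles when XmlReader.Create runs, one thing to try is making the imported signature schema available to the compiler explicitly, rather than relying on schemaLocation resolution. A hedged sketch of the settings setup (same URLs as above):

    // Sketch: register the imported schema and a resolver before compiling
    System.Xml.XmlReaderSettings settings = new System.Xml.XmlReaderSettings();
    settings.Schemas.XmlResolver = new System.Xml.XmlUrlResolver(); // follow xs:import locations
    settings.Schemas.Add("http://www.w3.org/2000/09/xmldsig#",
        "http://localhost/TMSWebServices/XMLSchema/xmldsig-core-schema.xsd");
    settings.Schemas.Add(null, XSDFILEPATH);
    settings.ValidationType = System.Xml.ValidationType.Schema;

    If the explicit Add makes KeyInfoType resolvable, the original failure was the schema set declining to fetch the import on its own, which the resolver assignment also addresses.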

    Read the article

  • Java Threading Concept Understanding

    - by Nirmal
    Hello All... Recently I went through a simple threading program, which raised some questions about the related concepts... My sample program code looks like: class NewThread implements Runnable { Thread t; NewThread() { t = new Thread(this, "Demo Thread"); System.out.println("Child thread: " + t); t.start(); // Start the thread } public void run() { try { for (int i = 5; i > 0; i--) { System.out.println("Child Thread: " + i); Thread.sleep(500); } } catch (InterruptedException e) { System.out.println("Child interrupted."); } System.out.println("Exiting child thread."); } } class ThreadDemo { public static void main(String args[]) { new NewThread(); // create a new thread try { for (int i = 5; i > 0; i--) { System.out.println("Main Thread: " + i); Thread.sleep(1000); } } catch (InterruptedException e) { System.out.println("Main thread interrupted."); } System.out.println("Main thread exiting."); } } Now this program gives me the following output: Child thread: Thread[Demo Thread,5,main] Main Thread: 5 Child Thread: 5 Child Thread: 4 Main Thread: 4 Child Thread: 3 Child Thread: 2 Main Thread: 3 Child Thread: 1 Exiting child thread. Main Thread: 2 Main Thread: 1 Main thread exiting. So far, that's clear to me. But as soon as I replace the object creation code (the call to the NewThread class constructor) with the following: NewThread nt = new NewThread(); // create a new thread the output varies a bit, like this: Child thread: Thread[Demo Thread,5,main] Main Thread: 5 Child Thread: 5 Child Thread: 4 Child Thread: 3 Main Thread: 4 Child Thread: 2 Child Thread: 1 Main Thread: 3 Exiting child thread. Main Thread: 2 Main Thread: 1 Main thread exiting. And sometimes it gives me the same output in both cases. So I am not seeing what exactly changes between the two scenarios. I would like to know why this variation in the output occurs. Thanks in advance...
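    The short explanation, for what it's worth: keeping the reference in nt changes nothing about execution. Once start() is called, how the two loops interleave is entirely up to the thread scheduler, so all of the outputs shown (and others) are legal for both versions, and identical runs are equally possible. If a fixed ordering is actually wanted, the threads have to be synchronized explicitly, for example with join. A sketch of main's body (reusing the existing try/catch):

    // Sketch: make the ordering deterministic by waiting for the child first
    NewThread nt = new NewThread();
    try {
        nt.t.join(); // main blocks here until "Demo Thread" finishes
        for (int i = 5; i > 0; i--) {
            System.out.println("Main Thread: " + i);
            Thread.sleep(1000);
        }
    } catch (InterruptedException e) {
        System.out.println("Main thread interrupted.");
    }
    System.out.println("Main thread exiting.");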

    Read the article
