Search Results

Search found 22794 results on 912 pages for 'xna content pipeline'.

Page 53 of 912

  • Stopping a pipeline - PowerShell

    - by laertejuniordba
    Hi all, I am using New-Object System.Management.Automation.PipelineStoppedException to stop the pipeline. OK, that works, but how can I test in the next cmdlet whether the pipeline was stopped? With Foo1 | foo2 | foo3 I stop in foo1, but execution still goes on to foo2. I want each following function to detect that foo1 stopped and stop as well:

        function foo1 {
            process {
                try {
                    #do
                }
                catch {
                    New-Object System.Management.Automation.PipelineStoppedException
                }
            }
        }

    Even with this, execution continues to the next cmdlet instead of stopping on the error. I want every subsequent cmdlet to stop too. :) Thanks!!!
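    A hedged observation, offered as a sketch rather than a definitive answer: New-Object only constructs the exception object and writes it to the output stream, it never throws it, so nothing actually terminates. Throwing it (or any terminating error) inside foo1 stops the whole pipeline, so foo2 and foo3 never receive further input:

        function foo1 {
            process {
                try {
                    # do the real work here
                }
                catch {
                    # Throwing (rather than just creating) the exception raises a
                    # terminating error that stops the entire Foo1 | foo2 | foo3 pipeline.
                    throw (New-Object System.Management.Automation.PipelineStoppedException)
                }
            }
        }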

    Read the article

  • Where to find Microsoft.SqlServer.Dts.Pipeline

    - by CoffeeAddict
    I'm opening a 2005 SSIS package and also an old C# project; both are in the same solution. I'm missing namespaces and I can't find the assemblies to add back to the References folder of my C# project. Microsoft.SqlServer.Dts.Pipeline, for example, is not one I can find in the .NET tab of the Add Reference dialog. So how the hell do I get these SQL Server assemblies? Do I have to install the SQL Server 2008 SDK? Lost.

    Read the article

  • MonoGame - Shader parameters missing

    - by Layoric
    I am currently working on a simple game that I am building in Windows 8 using MonoGame (develop3d). I am using some shader code from a tutorial (made by Charles Humphrey) and having an issue populating a 'texture' parameter. I'm not well versed in writing shaders, so this might be caused by a more obvious problem. I have debugged through MonoGame's Content processor to see how this shader is being parsed; all the non-'texture' parameters are there and look to be loading correctly. Shader code below:

        #include "PPVertexShader.fxh"

        float2 lightScreenPosition;
        float4x4 matVP;
        float2 halfPixel;
        float SunSize;
        texture flare;

        sampler2D Scene: register(s0){
            AddressU = Clamp;
            AddressV = Clamp;
        };

        sampler Flare = sampler_state {
            Texture = (flare);
            AddressU = CLAMP;
            AddressV = CLAMP;
        };

        float4 LightSourceMaskPS(float2 texCoord : TEXCOORD0 ) : COLOR0
        {
            texCoord -= halfPixel;

            // Get the scene
            float4 col = 0;

            // Find the suns position in the world and map it to the screen space.
            float2 coord;
            float size = SunSize / 1;
            float2 center = lightScreenPosition;
            coord = .5 - (texCoord - center) / size * .5;
            col += (pow(tex2D(Flare,coord),2) * 1) * 2;

            return col * tex2D(Scene,texCoord);
        }

        technique LightSourceMask
        {
            pass p0
            {
                VertexShader = compile vs_4_0 VertexShaderFunction();
                PixelShader = compile ps_4_0 LightSourceMaskPS();
            }
        }

    I've removed default values as they are currently not supported in MonoGame and also changed ps and vs to v4 instead of 2. Could this be causing the issue? As I debug through the 'DXConstantBufferData' constructor (from within the MonoGameContentProcessing project) I find that the 'flare' parameter does not exist. All others seem to be getting created fine. Any help would be appreciated.
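    A hedged sketch of a possible workaround (not from the original post): if the content-built effect really does drop the 'flare' parameter, the texture can still be bound to its sampler register directly from the game code. The names 'lightSourceMaskEffect' and 'flareTexture' are illustrative, and slot 1 assumes the Flare sampler compiles to register s1 (Scene is explicitly s0); verify that against your own build.

        // Preferred path: set the parameter if the processor kept it.
        EffectParameter flareParam = lightSourceMaskEffect.Parameters["flare"];
        if (flareParam != null)
        {
            flareParam.SetValue(flareTexture);
        }
        else
        {
            // Fallback: bind the texture straight to the sampler slot the
            // Flare sampler is expected to occupy (assumed to be s1 here).
            GraphicsDevice.Textures[1] = flareTexture;
            GraphicsDevice.SamplerStates[1] = SamplerState.LinearClamp;
        }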

    Read the article

  • Text contains characters that cannot be resolved by this SpriteFont

    - by Appeltaart
    I'm trying to draw currency symbols in an Arial SpriteFont. According to this site, the font contains these symbols. I've included the Unicode character region for these symbols in the Content Processor SpriteFont file, like so:

        <CharacterRegions>
            <CharacterRegion>
                <Start>&#32;</Start>
                <End>&#126;</End>
            </CharacterRegion>
            <CharacterRegion>
                <!-- Currency symbols -->
                <Start>&#8352;</Start>
                <End>&#8378;</End>
            </CharacterRegion>
        </CharacterRegions>

    However, whenever I try to draw a € (Euro) symbol, I get this exception: "Text contains characters that cannot be resolved by this SpriteFont". Inspecting the SpriteFont in the IDE at that breakpoint shows the € symbol is in there. I'm at a loss as to what could be wrong. Why is this exception thrown?
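    A small diagnostic sketch (not from the original post) that can narrow this down at runtime, assuming 'font' is the SpriteFont built from the .spritefont file above: SpriteFont.Characters lists every glyph the content processor actually baked in, and setting DefaultCharacter makes DrawString substitute that character instead of throwing.

        SpriteFont font = Content.Load<SpriteFont>("Arial");

        // € is U+20AC (8364), inside the 8352-8378 region declared above.
        bool hasEuro = font.Characters.Contains('\u20AC');

        // Optional safety net: draw '?' for any glyph missing from the built font
        // instead of raising "Text contains characters that cannot be resolved".
        font.DefaultCharacter = '?';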

    Read the article

  • 3d Model Scaling With Camera

    - by spasarto
    I have a very simple 3D maze program that uses a first person camera to navigate the maze. I'm trying to scale the blocks that make up the maze walls and floor so the corridors seem more roomy to the camera. Every time I scale the model, the camera seems to scale with it, and the corridors always stay the same width. I've tried applying the scale to the model in the content pipeline (setting the Scale property of the model in the Properties window in VS). I've also tried to apply the scale using Matrix.CreateScale(float) with the Scale-Rotate-Translate order, with the same result. If I leave the camera speed the same, the camera moves slower, so I know it's traversing a larger distance, but the world doesn't look larger; the camera just seems slower. I'm not sure what part of the code to include since I don't know if it is an issue with my model, camera, or something else. Any hints at what I'm doing wrong?

    Camera:

        Projection = Matrix.CreatePerspectiveFieldOfView( MathHelper.PiOver4, _device.Viewport.AspectRatio, 1.0f, 1000.0f );
        Matrix camRotMatrix = Matrix.CreateRotationX( _cameraPitch ) * Matrix.CreateRotationY( _cameraYaw );
        Vector3 transCamRef = Vector3.Transform( _cameraForward, camRotMatrix );
        _cameraTarget = transCamRef + CameraPosition;
        Vector3 camRotUpVector = Vector3.Transform( _cameraUpVector, camRotMatrix );
        View = Matrix.CreateLookAt( CameraPosition, _cameraTarget, camRotUpVector );

    Model:

        World = Matrix.CreateTranslation( Position );
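    A hedged sketch of one thing worth checking (the names are illustrative, not the original code): the scale has to end up in the world matrix of each block and in the spacing between blocks, while the camera's projection, speed and eye height stay in unscaled units; otherwise everything grows together and the corridors look exactly the same.

        float scale = 3f;                 // how much roomier the corridors should feel
        float blockSize = 1f;             // unscaled size of one maze cell

        // Scale * Rotate * Translate, with the translation also spread out by 'scale'
        // so neighbouring blocks don't overlap after they grow.
        Vector3 worldPosition = new Vector3(cellX, 0f, cellZ) * blockSize * scale;
        World = Matrix.CreateScale(scale) * Matrix.CreateTranslation(worldPosition);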

    Read the article

  • MonoGame/SharpDX - Shader parameters missing

    - by Layoric
    I am currently working on a simple game that I am building in Windows 8 using MonoGame (develop3d). I am using some shader code from a tutorial (made by Charles Humphrey) and having an issue populating a 'texture' parameter, as it appears to be missing. Edit: I have also tried 'Texture2D' and using it with register(t0), still no luck. I'm not well versed in writing shaders, so this might be caused by a more obvious problem. I have debugged through MonoGame's Content processor to see how this shader is being parsed; all the non-'texture' parameters are there and look to be loading correctly. Edit: This seems to go back to the D3D compiler. Shader code below:

        #include "PPVertexShader.fxh"

        float2 lightScreenPosition;
        float4x4 matVP;
        float2 halfPixel;
        float SunSize;
        texture flare;

        sampler2D Scene: register(s0){
            AddressU = Clamp;
            AddressV = Clamp;
        };

        sampler Flare = sampler_state {
            Texture = (flare);
            AddressU = CLAMP;
            AddressV = CLAMP;
        };

        float4 LightSourceMaskPS(float2 texCoord : TEXCOORD0 ) : COLOR0
        {
            texCoord -= halfPixel;

            // Get the scene
            float4 col = 0;

            // Find the suns position in the world and map it to the screen space.
            float2 coord;
            float size = SunSize / 1;
            float2 center = lightScreenPosition;
            coord = .5 - (texCoord - center) / size * .5;
            col += (pow(tex2D(Flare,coord),2) * 1) * 2;

            return col * tex2D(Scene,texCoord);
        }

        technique LightSourceMask
        {
            pass p0
            {
                VertexShader = compile vs_4_0 VertexShaderFunction();
                PixelShader = compile ps_4_0 LightSourceMaskPS();
            }
        }

    I've removed default values as they are currently not supported in MonoGame and also changed ps and vs to v4 instead of 2. Could this be causing the issue? As I debug through the 'DXConstantBufferData' constructor (from within the MonoGameContentProcessing project) I find that the 'flare' parameter does not exist. All others seem to be getting created fine. Any help would be appreciated.

    Update 1: I have discovered that the SharpDX D3D compiler is what seems to be ignoring this parameter (perhaps by design?). ConstantBufferDescription.VariableCount seems not to count the texture variable.

    Update 2: The SharpDX function 'GetConstantBuffer(int index)' returns the parameters (minus textures), which makes it impossible to set values for these variables within the shader. Anyone know if this is normal for DX11 / Shader Model 4.0? Or am I missing something else?
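    For what it's worth, this does appear to be by design: under Direct3D 11 / shader model 4 a texture is bound as a shader resource, not as a constant-buffer variable, so constant-buffer reflection (ConstantBufferDescription / GetConstantBuffer) will never list it. A hedged sketch of the SM4-style declaration, with the register assignments assumed rather than taken from the original project:

        // Texture and sampler are separate objects in shader model 4; the texture
        // occupies a t# shader-resource slot that the game binds directly
        // (e.g. GraphicsDevice.Textures[1] = flareTexture), not a cbuffer entry.
        Texture2D FlareTexture : register(t1);
        SamplerState FlareSampler : register(s1);

        // Inside LightSourceMaskPS the lookup then becomes:
        //     col += pow(FlareTexture.Sample(FlareSampler, coord), 2) * 2;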

    Read the article

  • Best practice way to handle variable content margin?

    - by Aithne
    Ok, so a quick site that I am throwing together for a friend has about 20 static pages, each one with a small amount of content. The div Container contains, well, the div Content, and the div Content contains, obviously, the content that changes on each page. Now, depending on the length of the content, I want a different margin at the top: the less content, the larger the margin. Simply an aesthetic choice. For example, if the content almost fills the static-sized container there is less padding, but a one-line page of content might be 1/3 of the way down the static container. Centering the content in the div won't do, as that creates too large of a margin. What's the best way to handle this? A new class for each content block with a different margin? A new id, so that it is in its own special div positioned or margined differently? Inline CSS on each page to override the standard CSS for div Content? A different spacer div inside Container before Content on each page? Some sort of scripting along the lines of margin of Content = (ContainerHeight - ContentHeight) / 3? What's the acceptable way of doing this? I don't want to get into bad habits.

    Read the article

  • Looking for a RESTful or SOAP pipeline between WordPress and InterWoven TeamSite

    - by deanpeters
    I've been Googling my brains out trying to see if there's a simple way to bridge content to and from WordPress and TeamSite. I'm coming at this from the perspective of a WordPress developer. I see in the book "The Definitive Guide to Interwoven TeamSite" (http://bit.ly/d3z4wI) mention of objects for the Interwoven LiveSite product: com.interwoven.livesite.external.impl.RSS and com.interwoven.livesite.external.impl.SOAP. If I understand these objects correctly, they allow me to instantiate objects of these types which, after populating them via various method calls, let me render content using com.interwoven.livesite.external.ExternalCall ... but I'm not sure. Nor do I think this approach provides me the two-way street I seek. As it stands now, from my limited understanding, it appears that the path of least resistance is deploying Interwoven's LiveSite with the existing TeamSite implementation so content can be both consumed and rendered via RSS, a channel which WordPress can produce and consume; the latter with plugins such as wp-o-matic and/or feedpress. So the question is, does anyone out there have experience with a SOAP or RESTful API approach to Interwoven's TeamSite? If so, can I get some direction on documentation? Or is the addition of LiveSite + RSS the most feasible two-way channel?

    Read the article

  • Enterprise 2.0 - Connecting People, Processes & Content

    - by kellsey.ruppel(at)oracle.com
    With recent technological advances, the Internet is changing. When users head to the web, they are no longer just looking for information from a simple text and picture based website. Users want a more interactive experience - they want to participate, to share their views and get the feedback of others. And this is precisely what Web 2.0 technology addresses. Web 2.0 is about web applications that facilitate interactive information sharing, user-centered design and collaboration on the World Wide Web. Web 2.0 technology is everywhere on the Internet and is radically changing the speed and medium in which we interact and communicate. There are thousands of examples in the consumer world of Web 2.0 applications, technologies and solutions at work. You might be familiar with some of them...blogs, wikis (Wikipedia), Twitter, Facebook, LinkedIn - these are all examples of Web 2.0. And these technologies are transforming our world into a real-time, participation-oriented, user-driven, content-centric world. With all of these Web 2.0 solutions it's about the user, the consumer and all the content they are generating. It's a world full of online communities where people share and participate. We're not talking about disseminating information top-down , nor is it a bottom-up fight. Everyone has an equal opportunity to participate and share. The more you participate, the more you share, the more valued you are in the community. The web is not just a collection of documents online. It is the social web.  For the active users in the community, staying connected becomes critically important so they can participate at anytime and from anywhere. And because feedback and interaction are so critical, time is of the essence. When everyone is providing immediate responses, you feel the urge to do the same. Hence everything needs to be done right now, together...and collaboratively. With all the content being generated online by users, there is complete information overload out there. (That's a good thing for Google). But...it's no longer just about search. Sometimes you want the information to just come to you. Recommendations and discovery engines will deliver you more applicable results than a non-contextual search. How many of you have heard about a news headline on Facebook as part of your feed before you read the paper or see it on TV? This is how the new generation of workers live their daily lives...and as they enter the workforce, these trends and technologies are showing up in the enterprise too. A lot of the Web 2.0 technologies and solutions in the consumer world are geared for just that....consumers. But the core concepts that put them into the Web 2.0 category can be applied to the enterprise as well. And that is what we mean when we talk about Enterprise 2.0. Enterprise 2.0 is the use of Web 2.0 tools and technologies in the workplace.  It provides a modern user experience by connecting the people, content and business processes inside and outside the enterprise. Enterprise 2.0 empowers users to collaborate more effectively, find and share information in the proper content and improves the overall business processes which they participate in. As we head into 2011, is your organization using Enterprise 2.0 capabilities to the fullest? Are you connecting your people, processes and content together to provide a modern user experience?

    Read the article

  • Illegal characters for SharePoint 2010 Content Type name

    - by Kelly Jones
    Quick tip: you can't include a backslash in the name of a SharePoint 2010 Content Type. In fact, there are several illegal characters: \ / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab. What, you didn't know that after entering one of these characters in the name? Is it because you saw this screen: Oh, that's right... you need to turn off custom errors in the layouts folder. See this blog post for details; you'll also need to turn it off for the web application. Once you do that, you'll see this: I wonder why the SharePoint team just doesn't let the user know that the content type name contains illegal characters before the user hits the Create button. Here's a copy of the complete error (for the search engines):

        Server Error in '/' Application.
        --------------------------------------------------------------------------------
        The content type name 'asdfadsf\asdfasf' cannot contain: \ / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab.

        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

        Exception Details: Microsoft.SharePoint.SPInvalidContentTypeNameException: The content type name 'asdfadsf\asdfasf' cannot contain: \ / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab.

        Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

        Stack Trace:
        [SPInvalidContentTypeNameException: The content type name 'asdfadsf\asdfasf' cannot contain: \ / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab.]
           Microsoft.SharePoint.SPContentType.ValidateName(String name) +27419522
           Microsoft.SharePoint.SPContentType.ValidateNameWithResource(String strVal, String& strLocalized) +423
           Microsoft.SharePoint.SPContentType.set_Name(String value) +151
           Microsoft.SharePoint.SPContentType.Initialize(SPContentType parentContentType, SPContentTypeCollection collection, String name) +112
           Microsoft.SharePoint.SPContentType..ctor(SPContentType parentContentType, SPContentTypeCollection collection, String name) +132
           Microsoft.SharePoint.ApplicationPages.ContentTypeCreatePage.BtnOK_Click(Object sender, EventArgs e) +497
           System.Web.UI.WebControls.Button.OnClick(EventArgs e) +115
           System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +140
           System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +29
           System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +2981
        --------------------------------------------------------------------------------
        Version Information: Microsoft .NET Framework Version: 2.0.50727.4927; ASP.NET Version: 2.0.50727.4927
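    For anyone who wants to catch this before the postback, here is a small illustrative helper (not SharePoint's own validation) that applies the character rules quoted in the error message:

        // Characters listed in the SPInvalidContentTypeNameException message,
        // plus the tab character; ".." is checked separately.
        private static readonly char[] IllegalContentTypeChars =
            "\\/:*?\"#%<>{}|~&,\t".ToCharArray();

        public static bool IsValidContentTypeName(string name)
        {
            if (string.IsNullOrEmpty(name)) return false;
            if (name.IndexOfAny(IllegalContentTypeChars) >= 0) return false;
            if (name.Contains("..")) return false;   // two consecutive periods
            return true;
        }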

    Read the article

  • C# XNA: Efficient mesh building algorithm for voxel based terrain ("top" outside layer only, non-destructible)

    - by Tim Hatch
    To put this bluntly, for non-destructible/non-constructible voxel style terrain, are generated meshes handled much better than instancing? Is there another method to achieve millions of visible quad faces per scene with ease? If generated meshes per chunk is the way to go, what kind of algorithm might I want to use based on only EVER needing the outer layer rendered? I'm using 3D Perlin Noise for terrain generation (for overhangs/caves/etc). The layout is fantastic, but even for around 20k visible faces, it's quite slow using instancing (whether it's one big draw call or multiple smaller chunks). I've simplified it to the point of removing non-visible cubes and only having the top faces of my cube-like terrain be rendered, but with 20k quad instances, it's still pretty sluggish (30fps on my machine). My goal is for the world to be made using quite small cubes. Where multiple games (IE: Minecraft) have the player 1x1 cube in width/length and 2 high, I'm shooting for 6x6 width/length and 9 high. With a lot of advantages as far as gameplay goes, it also means I could quite easily have a single scene with millions of truly visible quads. So, I have been trying to look into changing my method from instancing to mesh generation on a chunk by chunk basis. Do video cards handle this type of processing better than separate quads/cubes through instancing? What kind of existing algorithms should I be looking into? I've seen references to marching cubes a few times now, but I haven't spent much time investigating it since I don't know if it's the better route for my situation or not. I'm also starting to doubt my need of using 3D Perlin noise for terrain generation since I won't want the kind of depth it would seem best at. I just like the idea of overhangs and occasional cave-like structures, but could find no better 'surface only' algorithms to cover that. If anyone has any better suggestions there, feel free to throw them at me too. Thanks, Mythics
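    For what it's worth, a per-chunk generated mesh with hidden-face removal is the approach most voxel engines settle on for static terrain: the GPU gets one pre-built vertex/index buffer per chunk, and interior faces never reach the card. A minimal sketch of that build step follows; ChunkSize, IsSolid and AddQuad are hypothetical helpers (IsSolid reads the chunk's voxel array and treats out-of-bounds as empty, AddQuad appends the four vertices and six indices of one face), not code from the original post.

        var vertices   = new List<VertexPositionNormalTexture>();
        var indices    = new List<int>();
        var neighbours = new[] { (1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1) };

        for (int x = 0; x < ChunkSize; x++)
            for (int y = 0; y < ChunkSize; y++)
                for (int z = 0; z < ChunkSize; z++)
                {
                    if (!IsSolid(x, y, z)) continue;
                    foreach (var (dx, dy, dz) in neighbours)
                        if (!IsSolid(x + dx, y + dy, z + dz))      // face borders air: emit it
                            AddQuad(vertices, indices, x, y, z, dx, dy, dz);
                }

        // One static buffer per chunk; the whole chunk then renders in a single draw call.
        var vertexBuffer = new VertexBuffer(GraphicsDevice, typeof(VertexPositionNormalTexture),
                                            vertices.Count, BufferUsage.WriteOnly);
        vertexBuffer.SetData(vertices.ToArray());
        var indexBuffer = new IndexBuffer(GraphicsDevice, IndexElementSize.ThirtyTwoBits,
                                          indices.Count, BufferUsage.WriteOnly);
        indexBuffer.SetData(indices.ToArray());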

    Read the article

  • Oracle@info360: Advance Beyond Point Solutions To An Enterprise Content Strategy

    - by kellsey.ruppel(at)oracle.com
    The info360/AIIM conference is March 22-24 in Washington DC. We have a number of customer speakers this year talking on the theme of "Advance Beyond Point Solutions To An Enterprise Content Strategy." These customers all started by addressing a particular use case, but then used the infrastructure they had created to quickly and cost effectively stand up solutions to new business problems. Andy MacMillan, VP of Product Management at Oracle, will give a thought provoking opening keynote at 8:50 AM on Tuesday, March 22nd. He will be joined by Juan Jose Goldschtein, the CIO of the Organization of American States. The OAS has developed a human rights website that is the front end to a case management system for human rights violations. The implementation supports digital signatures on iPads, so their executives can approve workflows and keep cases moving forward while they are busy traveling and investigating abuses. Other customer speakers include:

        Tom Robinette, Director of Applications and IT Engineering, Dresser-Rand
        Robin Crisp, Program Manager, FDA
        Monica Crocker, Corporate Records Manager, Land O' Lakes
        Brian Skapura, The American Institute of Architects
        Kathy Adams and Leslie Becker, The Nature Conservancy
        Irfan Motiwala, Sr. VP, Moody's Investment Services
        Molly Wenzler, Director of Electronic Media, MeadWestvaco

    Other sessions include our Super Session that kicks off the Oracle Track @info360 on Wednesday. At 11:00 AM, Senior Director of Product Marketing Howard Beader will present The Social Enterprise - Combining People, Processes and Content. This session will focus on how customers have brought social media, business process management, and content management together to supercharge their organizations. Oracle customers can arrange one-on-one meetings with Oracle executives and product experts, and attend the VIP customer appreciation event. Oracle will be joined by Oracle partners:

        Fujitsu
        Keste
        TeamInformatics
        Kapow
        Sena Systems
        DTI

    You can learn more about discounts for Oracle customers and register on our Oracle@info360 page. To see more about the customers and sessions that will be presented, you can look at the Oracle Track page on the AIIM/info360 website. Technorati Tags: oracle, AIIM, info360, content management, social enterprise

    Read the article

  • Thread problem updating Windows Forms control in XNA C#

    - by Luis
    I'm developing a network card game, and for now I have two players connected, but there is a problem with one of them: that player can't do anything in the game. It looks like the screen is blocked. I think it is because of some code I used before. That code is:

        if (InvokeRequired)
        {
            this.Invoke(new MethodInvoker(delegate { ... }));
            return;
        }

    The code above wraps the code that changes Button values, makes the connection to the server and creates the game window. Without this code the following warning is shown: InvalidOperationException was unhandled - Cross-thread operation not valid: Control 'startGameButton' accessed from a thread other than the thread it was created on.
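    A hedged sketch of one way to avoid both the exception and the frozen screen (method names are illustrative, only 'startGameButton' comes from the original post): keep the server connection on the background thread and marshal only the control updates, using BeginInvoke so the network thread isn't blocked waiting for the UI.

        private void OnConnectedToServer()          // runs on the network thread
        {
            // connection work stays here, off the UI thread

            if (startGameButton.InvokeRequired)
            {
                // BeginInvoke queues the UI work without blocking this thread.
                startGameButton.BeginInvoke(new MethodInvoker(UpdateUiAfterConnect));
            }
            else
            {
                UpdateUiAfterConnect();
            }
        }

        private void UpdateUiAfterConnect()         // always runs on the UI thread
        {
            startGameButton.Enabled = true;         // safe to touch controls here
            // create and show the game window here as well
        }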

    Read the article

  • HTML Manifest for Content Folios

    - by Kyle Hatlestad
    I recently worked on a project to create a custom content folio renderer in WebCenter Content. It needed to output the native files in the folio along with a manifest file in HTML format which would list the contents of the folio along with any designated metadata and a relative link to the file within the download. This way a person could hand someone the folio download and it would be a self-contained package with all of the content and a single file to display the information on the contents. The default Zip rendition of the folio will output the web-viewable version of the file with an HDA formatted file for each one. And unless you are fluent in HDA or have a tool to read them, they are difficult to consume. I thought this might be useful for others, so I'm posting a copy of the component here. Beyond the standard instructions for installing a component, there is an environment configuration file (folionativezipwithmanifestrenderer_environment.cfg) which has a couple of options:

        FolioMetadataManifestList - A comma separated list of metadata fields (system or custom) that should be included in the manifest file.
        FolioMetadataManifestUseOriginalFilename - (True or False) If set to True, the filenames in the zip file will be based on the original filename as it was checked into WebCenter Content. If False, it will use the 'Name' of the item as defined within the Folio. This is usually the Title of the item.

    The component also includes the source code, so feel free to use this as a reference for creating other interesting folios.
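    For illustration only (the metadata field names below are placeholders, not values shipped with the component), the environment file might end up looking something like this:

        # folionativezipwithmanifestrenderer_environment.cfg
        FolioMetadataManifestList=dDocName,dDocTitle,dDocAuthor,dDocType
        FolioMetadataManifestUseOriginalFilename=true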

    Read the article

  • ContentPlaceHolders: Repeated Content

    - by brad
    Scenario: I have an application using ASP.NET Master Pages in which I would like to repeat some content at the top and bottom of a page. Currently I use something like this:

    Master Page:

        <html>
        <body>
            <asp:ContentPlaceHolder ID="Foo" runat="server">
            </asp:ContentPlaceHolder>
            <!-- page content -->
            <asp:ContentPlaceHolder ID="Bar" runat="server">
            </asp:ContentPlaceHolder>
        </body>
        </html>

    Content Page:

        <asp:Content ID="Top" ContentPlaceHolderID="Foo" runat="server">
            <!-- content -->
        </asp:Content>
        <asp:Content ID="Bottom" ContentPlaceHolderID="Bar" runat="server">
            <!-- content repeated -->
        </asp:Content>

    Maintenance: As you know, repeating things in code is usually not good; it creates maintenance problems. The following is what I would like to do, but it will obviously not work because of the repeated ID attribute:

    Master Page:

        <html>
        <body>
            <asp:ContentPlaceHolder ID="Foo" runat="server">
            </asp:ContentPlaceHolder>
            <!-- page content -->
            <asp:ContentPlaceHolder ID="Foo" runat="server">
            </asp:ContentPlaceHolder>
        </body>
        </html>

    Content Page:

        <asp:Content ID="Top" ContentPlaceHolderID="Foo" runat="server">
            <!-- content (no repetition) -->
        </asp:Content>

    Possible? Is there a way to do this using ASP.NET Web Forms? The solution does not necessarily have to resemble the above content; it just needs to work the same way. Notes: I am using ASP.NET 3.0 in Visual Studio 2008.
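    One common way to cut the duplication without changing the master page, offered here as a hedged sketch (the RepeatedBlock user control is illustrative, not part of the original question): move the repeated markup into a single .ascx and reference it from both Content blocks, so the content itself lives in one place.

        <%@ Register Src="~/Controls/RepeatedBlock.ascx" TagPrefix="uc" TagName="RepeatedBlock" %>

        <asp:Content ID="Top" ContentPlaceHolderID="Foo" runat="server">
            <uc:RepeatedBlock ID="TopBlock" runat="server" />
        </asp:Content>
        <asp:Content ID="Bottom" ContentPlaceHolderID="Bar" runat="server">
            <uc:RepeatedBlock ID="BottomBlock" runat="server" />
        </asp:Content>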

    Read the article

  • Default value list for pipeline param in PowerShell

    - by fatcat1111
    I have a PowerShell script that reads values off of the pipeline:

        PARAM (
            [Parameter(ValueFromPipeline = $true)]
            $s
        )
        PROCESS {
            echo "* $s"
        }

    Works just fine:

        PS my.ps1 foo
        * foo

    I would like the script to have a list of default values, as the most common usage will always use the same values and storing them in the default will be most convenient. I did the usual assignment:

        PARAM (
            [Parameter(ValueFromPipeline = $true)]
            $s = 'bar'
        )
        PROCESS {
            echo "* $s"
        }

    Again, works just fine:

        PS my.ps1
        * bar
        PS my.ps1 foo
        * foo

    However, when setting the default to be a list, I get back something entirely reasonable but not at all what I want:

        PARAM (
            [Parameter(ValueFromPipeline = $true)]
            $s = @('bar', 'bat', 'boy')
        )
        PROCESS {
            echo "* $s"
        }

    Result:

        PS my.ps1
        * bar bat boy

    I expected:

        PS my.ps1
        * bar
        * bat
        * boy

    How can I get one call into the Process loop for each default value? (This is somewhat different from getting one call into Process and wrapping the current body in a big foreach loop over $s.)
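    A hedged sketch of one pattern that produces the expected output without wrapping the real work in a foreach inside PROCESS: put the per-item work in a small nested function, call it from PROCESS for bound input, and fall back to the default list in END when nothing was bound. Write-Item and the flag name are illustrative.

        PARAM (
            [Parameter(ValueFromPipeline = $true)]
            $s
        )
        BEGIN {
            function Write-Item($item) { "* $item" }   # the per-value work lives here once
            $boundInput = $false
        }
        PROCESS {
            if ($PSBoundParameters.ContainsKey('s')) {
                $boundInput = $true
                Write-Item $s
            }
        }
        END {
            if (-not $boundInput) {
                # No pipeline input and no argument: emit the defaults one at a time.
                'bar', 'bat', 'boy' | ForEach-Object { Write-Item $_ }
            }
        }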

    Read the article

  • Best Practices for Content Types in SharePoint

    - by Anna Karin
    Hi all, recently we came across a severe problem with Content Types in our production farm. I would like to explain the background of the problem first. We have a nicely working feature for Content Type installation in the production and test farms; we developed and deployed this SharePoint feature (using WSPs) in Visual Studio. We are using publishing pages with page layouts and Content Types to help content editors quickly publish web pages. Unfortunately, some Content Types have been manually updated or added by people directly in production, so whenever I (the developer) make changes to the existing Content Types (using Visual Studio and feature activation/deactivation), SharePoint removes one or two columns from the Content Types during feature activation/deactivation, usually the columns that were not added in a best-practice way. I think the best practice is to update Content Types using Visual Studio. Now I wish to ensure that site columns don't get removed from the Content Types upon feature activation/deactivation. Note: our feature for Content Type activation/deactivation doesn't declare any activation dependencies in the feature.xml.

    Read the article

  • Make your CHM Help Files show HTML5 and CSS3 content

    - by Rick Strahl
    The HTML Help 1.0 specification, aka CHM files, is pretty old. In fact, it's practically ancient, as it was introduced in 1997 when Internet Explorer 4 was introduced. Html Help 1.0 is basically a completely HTML based Help system that uses a Help Viewer that internally uses Internet Explorer to render the HTML Help content. Because of its use of the Internet Explorer shell for rendering there were many security issues in the past, which resulted in locking down of the Web Browser control in Windows and also the Help Engine, which caused some unfortunate side effects. Even so, CHM continues to be a popular help format because it is very easy to produce content for it using plain HTML, and because it works with many Windows application platforms out of the box. While there have been various attempts to replace CHM help files, CHM files still seem to be a popular choice for many applications to display their help systems. The biggest alternative these days is no system based help at all, but links to online documentation. For Windows apps though it's still very common to see CHM help files, and there are still a ton of CHM help out there and lots of tools (including our own West Wind Html Help Builder) that produce output for CHM files as well as Web output.

    Image is Everything and you ain't got it!

    One problem with the CHM engine is that it's stuck with an ancient Internet Explorer version for rendering. For example, if you have help content that uses HTML5 or CSS3 content you might have an HTML Help topic like the following, shown here in a full Web Browser instance of Internet Explorer: The page clearly uses some CSS3 features like rounded corners and box shadows that are rendered using plain CSS3 features. Note that I used Internet Explorer on purpose here to demonstrate that IE9 on Windows 7 can properly render this content using some of the new features of CSS, but the same is true for all other recent versions of the major browsers (FireFox 3.1+, Safari 4.5+, WebKit 9+ etc.). Unfortunately, if you take this nice and simple CSS3 content and run it through the HTML Help compiler to produce a CHM file, the resulting output on the same machine looks a bit less flashy: all the CSS3 styling is gone, and although the page display and functionality still work, the extra styling features are gone. This even though I am running this on a Windows 7 machine that has IE9, which should be able to render these CSS features. Bummer.

    Web Browser Control - perpetually stuck in IE 7 Mode

    The problem is the Web Browser/Shell Components in Windows. This component is and has been part of Windows for as long as Internet Explorer has been around, but the Web Browser control hasn't kept up with the latest versions of IE. In a nutshell, the control is stuck in IE7 rendering mode for engine compatibility reasons by default. However, there is at least one way to fix this explicitly using Registry keys on a per application basis. The key point from that blog article is that you can override the IE rendering engine for a particular executable by setting one (or more) registry flags that tell the Windows Shell which version of the Internet Explorer rendering engine to load. An application that wishes to use a more recent version of Internet Explorer can then register itself during installation for the specific IE version desired, and from then on the application will use that version of the Web Browser component.
    If the application is older than the specified version it falls back to the default version (IE 7 rendering).

    Forcing CHM files to display with IE9 (or later) Rendering

    Knowing that we can force the IE usage for a given process, it's also possible to affect the CHM rendering by setting the same keys on the executable that's hosting the CHM file. What that executable file is depends on the type of application, as there are a number of ways that can launch the help engine:

        hh.exe - The standalone Windows CHM Help Viewer that launches when you launch a CHM from Windows Explorer. You can manually add hh.exe to the registry keys.
        YourApplication.exe - If you're using .NET or any tool that internally uses the hhControl ActiveX control to launch help content, your application is your host. You should add your application's exe to the registry during application startup.
        foxhhelp9.exe - If you're building a FoxPro application that uses the built-in help features, foxhhelp9.exe is used to actually host the help controls. Make sure to add this executable to the registry.

    What to set

    You can configure the Internet Explorer version used for an application in the registry by specifying the executable file name and a value that specifies the IE version desired. There are two different sets of keys for 32 bit and 64 bit applications.

        32 bit only or 64 bit: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\MAIN\FeatureControl\FEATURE_BROWSER_EMULATION  (Value Key: hh.exe)
        32 bit on 64 bit machine: HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\MAIN\FeatureControl\FEATURE_BROWSER_EMULATION  (Value Key: hh.exe)

    Note that it's best to always set both values, ideally when you install your application, so it works regardless of which platform you run on. The value specified is a DWORD value, and the interesting values are decimal 9000 for IE9 rendering mode depending on !DOCTYPE settings, or 9999 for IE9 standards mode always. You can use the same logic for 8000 and 8888 for IE8, and the final value of 7000 for IE7 (one has to wonder what they're going to do for version 10 to perpetuate that pattern). I think 9000 is the value you'd most likely want to use: 9000 means that IE9 will be used for rendering, but unless the right doctypes are used (XHTML and HTML5 specifically) IE will still fall back into quirks mode as needed. This should allow existing pages to continue to use the fallback engine while new pages that have the proper HTML doctype set can take advantage of the newest features. Here's an example of how I set the registry keys in my Tarma Installmate registry configuration: Note that I set all three values both under the Software and Wow6432Node keys so that this works regardless of where these EXEs are launched from. Even though all apps are 32 bit apps, the 64 bit (the default one shown selected) key is often used. So, now that I've set the registry key for hh.exe, I can launch my CHM help file from Explorer and see the following CSS3 IE9 rendered display:

    Summary

    It sucks that we have to go through all these hoops to get what should be natural behavior for an application to support the latest features available on a system. But it shouldn't be a surprise: the Windows Help team (if there even is such a thing) has not been known for forward looking technologies. It's a pretty big hassle that we have to resort to setting registry keys in order to get the Web Browser control and the internal CHM engine to render itself properly, but at least it's possible to make it work after all.
    Using this technique it's possible to ship an application with a help file and allow your CHM help to display with richer CSS markup and correct rendering using the stricter and more consistent XHTML or HTML5 doctypes. If you provide both Web help and in-application help (and why not if you're building from a single source) you now can side step the issue of your customers asking: Why does my help file look so much shittier than the online help… No more!

    © Rick Strahl, West Wind Technologies, 2005-2012
    Posted in HTML5, Help, Html Help Builder, Internet Explorer, Windows
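    As a convenience, the two hh.exe values described above can be captured in a .reg file; the dword 0x2328 is just decimal 9000, and the same pattern applies to any other host executable you add (a sketch based on the key paths and values quoted in the post):

        Windows Registry Editor Version 5.00

        [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\MAIN\FeatureControl\FEATURE_BROWSER_EMULATION]
        "hh.exe"=dword:00002328

        [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\MAIN\FeatureControl\FEATURE_BROWSER_EMULATION]
        "hh.exe"=dword:00002328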

    Read the article

  • Oracle Products Reflect Key Trends Shaping Enterprise 2.0

    - by kellsey.ruppel(at)oracle.com
    Following up on his predictions for 2011, we asked Enterprise 2.0 veteran Andy MacMillan to map out the ways Oracle solutions are at the forefront of industry trends--and how Oracle customers can benefit in the coming year. 1. Increase organizational awareness | Oracle WebCenter Suite Oracle WebCenter Suite provides a unique set of capabilities to drive organizational awareness. In particular, the expansive activity graph connects users directly to key enterprise applications, activities, and interests. In this way, applicable and critical business information is automatically and immediately visible--in the context of key tasks--via real-time dashboards and comprehensive reporting. Oracle WebCenter Suite also integrates key E2.0 services, such as blogs, wikis, and RSS feeds, into critical business processes, including back-office systems of records such as ERP and CRM systems. 2. Drive online customer engagement | Oracle Real-Time Decisions With more and more business being conducted on the Web, driving increased online customer engagement becomes a critical key to success. This effort is usually spearheaded by an increasingly important executive role, the Head of Online, who usually reports directly to the CMO. To help manage the Web experience online, Oracle solutions are driving a new kind of intelligent social commerce by combining Oracle Universal Content Management, Oracle WebCenter Services, and Oracle Real-Time Decisions with leading e-commerce and product recommendations. Oracle Real-Time Decisions provides multichannel recommendations for content, products, and services--including seamless integration across Web, mobile, and social channels. The result: happier customers, increased customer acquisition and retention, and improved critical success metrics such as shopping cart abandonment. 3. Easily build composite applications | Oracle Application Development Framework Thanks to the shared user experience strategy across Oracle Fusion Middleware, Oracle Fusion Applications and many other Oracle Applications, customers can easily create real, customer-specific composite applications using Oracle WebCenter Suite and Oracle Application Development Framework. Oracle Application Development Framework components provide modular user interface components that can build rich, social composite applications. In addition, a broad set of components spanning BPM, SOA, ECM, and beyond can be quickly and easily incorporated into composite applications. 4. Integrate records management into a global content platform | Oracle Enterprise Content Management 11g Oracle Enterprise Content Management 11g provides leading records management capabilities as part of a unified ECM platform for managing records, documents, Web content, digital assets, enterprise imaging, and application imaging. This unique strategy provides comprehensive records management in a consistent, cost-effective way, and enables organizations to consolidate ECM repositories and connect ECM to critical business applications. 5. Achieve ECM at extreme scale | Oracle WebLogic Server and Oracle Exadata To support the high-performance demands of a unified and rationalized content platform, Oracle has pioneered highly scalable and high-performing ECM infrastructures. Two innovations in particular helped make this happen. The core ECM platform itself moved to an Enterprise Java architecture, so organizations can now use Oracle WebLogic Server for enhanced scalability and manageability. 
Oracle Enterprise Content Management 11g can leverage Oracle Exadata for extreme performance and scale. Likewise, Oracle Exalogic--Oracle's foundation for cloud computing--enables extreme performance for processor-intensive capabilities such as content conversion or dynamic Web page delivery. Learn more about Oracle's Enterprise 2.0 solutions.

    Read the article

  • Apache Sending "Content-Length : 0" , How to Fix ?

    - by ServerZilla
    Hi, I am using the Apache server and it is sending a Content-Length = 0 value, which is preventing file downloads; see http://www.youtubedroid.com/download2.php?v=%5F3XcMEKNws0&title=Akhila+%2CMumbai+reloaded%2CSuper+dancer+2&hq=0. Here is my .htaccess content:

        SetEnv no-gzip dont-vary

    Here are the headers sent by the server:

        HTTP/1.1 200 OK
        Date: Tue, 15 Dec 2009 06:12:11 GMT
        Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8e-fips-rhel5 mod_bwlimited/1.4
        X-Powered-By: PHP/5.2.11
        Content-Description: File Transfer
        Content-Disposition: attachment; filename="Akhila ,Mumbai reloaded,Super dancer 2.mp3"
        Content-Transfer-Encoding: binary
        Expires: 0
        Cache-Control: must-revalidate, post-check=0, pre-check=0
        Pragma: public
        X-Sendfile: ./tmp/64eb3b185e38af95c15405ffb0606e76.mp3
        Content-Length: 0
        Keep-Alive: timeout=5, max=95
        Connection: Keep-Alive
        Content-Type: application/octet-stream

    Please tell me how to fix this.
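    A hedged guess at the cause, based only on the headers above: the X-Sendfile header has leaked through to the client, which usually means the PHP script expects mod_xsendfile to take over the transfer but the module isn't installed or enabled, so Apache sends an empty body (hence Content-Length: 0). If that is the case, something along these lines would be needed (module location and paths are illustrative), plus an absolute path in the X-Sendfile header instead of ./tmp/...:

        # httpd.conf (or a vhost include) -- requires mod_xsendfile to be installed
        LoadModule xsendfile_module modules/mod_xsendfile.so

        XSendFile On
        XSendFilePath /home/youtubedroid/public_html/tmp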

    Read the article
