Search Results

Search found 4429 results on 178 pages for 'imagine digital'.


  • Connecting PC to TV via HDMI/DVI: Windows XP doesn't allow the appropriate screen resolution

    - by Jørgen
    I have a computer that is connected to the living room TV (a Panasonic) via HDMI. There is no other monitor connected. My problem is that the computer, which is running Windows XP, does not allow me to set the proper resolution for the TV. Both the graphics adapter and the TV should support the 1280x720 resolution, but it cannot be selected - the only available options are 1280x600 and 800x600, both in the "native" Windows dialog box and the custom Intel graphics options dialog box. Does anyone have a suggestion for a solution to this?

    Things I've thought of: setting the resolution directly in the registry (where?); installing some "custom" monitor driver (the TV manufacturer does not appear to provide any, currently the "generic" one is used).

    Details on the setup: Connection: DVI output on the computer via a passive DVI-HDMI adapter to the HDMI input on the TV. Audio is run on a separate link; the TV is able to combine video and audio without any problem, and the problem is there regardless of whether or not the audio is connected. The connection is several meters long through some walls, so using a VGA cable instead is not an option. Note that the report below explicitly says that the TV supports 1280x720. Still, I am not allowed to select it in Graphics Options; only 1280x600 and 800x600 are available. For 800x600, there's a lot of black around the edges; for 1280x600, the screen is "zoomed" so the edges of the monitor image (like the taskbar) are not visible.

    Other: The computer is running Windows XP. More recent versions of Windows are not an option (I have no licence). Linux is probably not an option (some of the video streaming sites I plan to use do not support it, I think). I wrote the rest of the details below. Thanks for any help!

    TV: Panasonic TX-L32X10Y, European version; a 720p 32" quite "regular" LCD TV. Allowed resolutions according to the manual:
    Signal name: 640x480 @60Hz, Horizontal frequency: 31.47 kHz, Vertical frequency: 60 Hz
    Signal name: 750 (720) / 60p, Horizontal frequency: 45.00 kHz, Vertical frequency: 60 Hz
    Signal name: 1,125 (1,080) / 60p, Horizontal frequency: 67.50 kHz, Vertical frequency: 60 Hz
    (This is exactly how the manual presents it. PC via D-SUB (VGA cable) and "regular" HDMI have more alternatives.) Messing with the "zoom" settings on the TV does not affect the available resolution options on the computer.

    Computer: The following is a printout from one of the graphics adapter option pages. I think it covers most of it. The computer is a Dell.
INTEL(R) EXTREME GRAPHICS 2 REPORT Report Date: 04/17/2011 Report Time[hr:mm:ss]: 20:18:02 Driver Version: 6.14.10.4396 Operating System: Windows XP* Professional, Service Pack 3 (5.1.2600) Default Language: English DirectX* Version: 9.0 Physical Memory: 1021 MB Minimum Graphics Memory: 1 MB Maximum Graphics Memory: 96 MB Graphics Memory in Use: 6 MB Processor: x86 Processor Speed: 2593 MHZ Vendor ID: 8086 Device ID: 2572 Device Revision: 02 * Accelerator Information * Accelerator in Use: Intel(R) 82865G Graphics Controller Video BIOS: 2972 Current Graphics Mode: 1280 by 600 True Color (60 Hz) * Devices Connected to the Graphics Accelerator * Active Digital Displays: 1 * Digital Display * Monitor Name: Plug and Play Monitor Display Type: Digital Gamma Value: 2.20 DDC2 Protocol: Supported Maximum Image Size: Horizontal: Not Available Vertical: Not Available Monitor Supported Modes: 1280 by 720 (50 Hz) 1280 by 720 (60 Hz) Display Power Management Support: Standby Mode: Not Supported Suspend Mode: Not Supported Active Off Mode: Not Supported (disclaimer: this question was also asked at the Wikipedia Reference Desk some time ago and might show up in a Google search. I got no useful answers there.)
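    One route the poster might try (offered here as a rough sketch, not a confirmed fix): Windows XP builds its mode list partly from the monitor INF, so a hand-written monitor INF that declares the 1280x720 mode from the manual can sometimes unlock it. The structure below is modeled on the in-box monitor.inf; the hardware ID, names and timing ranges are illustrative guesses, not Panasonic values, and with the old Intel 865G driver the driver's own mode table may still win.

        ; custom-panasonic-720p.inf -- illustrative monitor override, not a vendor file
        [Version]
        Signature="$WINDOWS NT$"
        Class=Monitor
        ClassGUID={4D36E96E-E325-11CE-BFC1-08002BE10318}
        Provider=%Provider%
        DriverVer=04/17/2011,1.0.0.0

        [Manufacturer]
        %Mfg%=Models

        [Models]
        ; the hardware ID after "Monitor\" is a placeholder - read the real one from Device Manager
        %TVName%=TV.Install, Monitor\PAN0000

        [TV.Install]
        AddReg=TV.AddReg

        [TV.AddReg]
        HKR,,MaxResolution,,"1280,720"
        ; horizontal kHz range, vertical Hz range - roughly the 45.00 kHz / 60 Hz mode from the manual
        HKR,"MODES\1280x720",Mode1,,"44.0-46.0,59.0-61.0"
        HKR,,DPMS,,0

        [Strings]
        Provider="Custom"
        Mfg="Panasonic"
        TVName="Panasonic TX-L32X10 (1280x720 over HDMI)"

    Installing it via Device Manager (Update Driver on the Plug and Play Monitor, "Have Disk") and rebooting is the usual test; if the Intel control panel still refuses the mode, the limitation lies in the driver rather than in the monitor description.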

    Read the article

  • Enable Media Streaming in Windows Home Server to Windows Media Player

    - by Mysticgeek
    One of the cool features of Windows Home Server is the ability to stream photos, music, and video to other computers on your network. Today we take a look at how to enable streaming in WHS to Windows Media Player in Vista and Windows 7.

    Turn on Media Streaming on WHS: To enable Media Streaming from Windows Home Server, open the Windows Home Server Console and click on Settings. Now in the Settings screen select Media Sharing, then in the right column under Media Library Sharing turn on Library Sharing for the folders you want to stream. If you have a Windows 7 machine on your network, make sure media streaming is enabled. You should then see the server under Other Libraries and can start streaming your media collection.

    Stream Video to Media Player 11: Now let's say you want to stream videos to another member of your household who's using a Vista machine in another room through Windows Media Player 11. Open WMP and click on Library, then Media Sharing. Now check the box next to Find media that others are sharing, then click Ok. You should see the server listed under Library, where in this example it's geekserver. Since we only enabled Video streaming for this example, we need to click on the category icon and select Video. Now you can scroll through the available videos and start enjoying your favorite videos streamed from the server through WMP 11 on Vista. Of course you can use this method to stream photos and music as well; you just need to enable what you want to stream from the Home Server Console. You can also stream your media to Windows Media Center and Xbox, which we will be covering soon.

    Read the article

  • Get Professional SEO Service From SEO Experts

    SEO service is imperative to online business. Business today has gone digital, and with its power to reach international audiences, almost every brand is bidding to establish an online presence. However, it's not easy to gain an online presence without a proper SEO service.

    Read the article

  • SQL SERVER – Introduction to Adaptive ETL Tool – How adaptive is your ETL?

    - by pinaldave
    I am often reminded of the fact that BI/data warehousing infrastructure is very brittle and not very adaptive to change. There are lots of basic use cases where data needs to be frequently loaded into SQL Server or another database. What I have found is that as long as the sources and targets stay the same, SSIS or any other ETL tool for that matter does a pretty good job handling these types of scenarios. But what happens when you are faced with more challenging scenarios, where the data formats and possibly the data types of the source data are changing from customer to customer? Let's examine a real-life situation where a health management company receives claims data from their customers in various source formats. Even though this company supplied all their customers with the same claims forms, they ended up building one-off ETL applications to process the claims for each customer. Why, you ask? Well, it turned out that the claims data from various regional hospitals they needed to process had slightly different data formats, e.g. "integer" versus "string" data field definitions. Moreover, the data itself was represented with slight nuances, e.g. "0001124" or "1124" or "0000001124" to represent a particular account number, which forced them, as I alluded to above, to build new ETL processes for each customer in order to overcome the inconsistencies in the various claims forms. As a result, they experienced a lot of redundancy in these ETL processes and recognized quickly that their system would become more difficult to maintain over time. So imagine for a moment that you could use an ETL tool that helps you abstract the data formats so that your ETL transformation process becomes more reusable. Imagine that one claims form represents a data item as a string - acc_no(varchar) - while a second claims form represents the same data item as an integer - account_no(integer). This would break your traditional ETL process as the data mappings are hard-wired. But in a world of abstracted definitions, all you need to do is create parallel data mappings to a common data representation used within your ETL application; that is, map both external data fields to a common attribute whose name and type remain unchanged within the application. acc_no(varchar) is mapped to account_number(integer) [expressor Studio: first claim form schema mapping], and account_no(integer) is also mapped to account_number(integer) [expressor Studio: second claim form schema mapping]. All the data processing logic that follows manipulates the data as an integer value named account_number. Well, these are the kinds of problems that the expressor data integration solution automates for you. I've been following them since last year and encourage you to check them out by downloading their free expressor Studio ETL software. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: ETL, SSIS
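    The abstraction idea is easier to see in code. Here is a small, purely illustrative C# sketch (expressor Studio itself is a graphical mapping tool, so the class and method names below are made up, not its API): both claim-form layouts are funneled into one canonical attribute, and everything downstream only ever sees that attribute.

        // Canonical representation used by all downstream processing logic
        public class CanonicalClaim
        {
            public int AccountNumber { get; set; }   // common name and type, never changes
        }

        public static class ClaimMappings
        {
            // First claim form: acc_no arrives as a varchar such as "0001124" or "1124"
            public static CanonicalClaim FromFirstForm(string accNo) =>
                new CanonicalClaim { AccountNumber = int.Parse(accNo) };

            // Second claim form: account_no already arrives as an integer
            public static CanonicalClaim FromSecondForm(int accountNo) =>
                new CanonicalClaim { AccountNumber = accountNo };
        }

    With this shape, onboarding a third customer format means adding one more mapping function rather than cloning a whole ETL package, which is exactly the reusability the post is describing.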

    Read the article

  • Automapper: Handling NULL members

    - by PSteele
    A question about null members came up on the Automapper mailing list.  While the problem wasn’t with Automapper, investigating the issue led to an interesting feature in Automapper. Normally, Automapper ignores null members.  After all, what is there really to do?  Imagine these source classes: public class Source { public int Data { get; set; } public Address Address { get; set; } }   public class Destination { public string Data { get; set; } public Address Address { get; set; } }   public class Address { public string AddressType { get; set; } public string Location { get; set; } } And imagine a simple mapping example with these classes: Mapper.CreateMap<Source, Destination>();   var source = new Source { Data = 22, Address = new Address { AddressType = "Home", Location = "Michigan", }, };   var dest = Mapper.Map<Source, Destination>(source); The variable ‘dest’ would have a complete mapping of the Data member and the Address member. But what if the source had no address? Mapper.CreateMap<Source, Destination>();   var source = new Source { Data = 22, };   var dest = Mapper.Map<Source, Destination>(source); In that case, Automapper would just leave the Destination.Address member null as well.  But what if we always wanted an Address defined – even if it’s just got some default data?  Use the “NullSubstitute” option: Mapper.CreateMap<Source, Destination>() .ForMember(d => d.Address, o => o.NullSubstitute(new Address { AddressType = "Unknown", Location = "Unknown", }));   var source = new Source { Data = 22, };   var dest = Mapper.Map<Source, Destination>(source); Now, the ‘dest’ variable will have an Address defined with a type and location of “Unknown”.  Very handy! Technorati Tags: .NET,Automapper,NULL

    Read the article

  • SCALE 8x: Color management for everyone

    LWN.net: "Color management is sometimes unfairly characterized as a topic of interest only to print shops and video editors, but as Cruz explained at the top of his talk, anyone who shares digital content wants it to look correct, and everyone who uses more than one device knows how tricky that can be."

    Read the article

  • An SAP intern wins the "Best Developer in France" competition at École 42; Salesforce.com awards him €10,000

    An SAP intern wins the "Best Developer in France" competition at École 42, and Salesforce offers him €10,000. The event drew quite a crowd. It must be said that the "Best Developer in France" competition, whose first edition took place last week, was particularly well orchestrated by the company Going to Digital, one of whose associate directors learned his trade at a subsidiary of Rentabiliweb, the digital marketing company of the enigmatic...

    Read the article

  • Delete directory by referencing symbolic link

    - by Adam
    To set up the question, imagine this scenario:
        mkdir ~/temp
        cd ~/
        ln -s temp temporary
    rm -rf temporary, rm -f temporary, and rm temporary each will remove the symbolic link but leave the directory ~/temp/. I have a script where the name of the symbolic link is easily derived but the name of the linked directory is not. Is there a way to remove the directory by referencing the symbolic link, short of parsing the name of the directory from ls -od ~/temporary?
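    For readers hitting the same thing, a minimal sketch of one common approach (GNU coreutils; treat it as an illustration rather than the accepted answer): readlink -f resolves the link to the real directory path, which can then be removed before the link itself.

        #!/bin/sh
        # Resolve the symlink to its target directory, delete the target, then the link
        target="$(readlink -f ~/temporary)"
        rm -rf "$target"
        rm ~/temporary

    This avoids parsing ls output entirely; the only assumption is that readlink -f is available (it is on GNU systems, and realpath can stand in elsewhere).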

    Read the article

  • Week in Geek: Internet Service Providers to Implement New Anti-Piracy Monitoring in July

    - by Asian Angel
    Our latest edition of WIG is filled with news link goodness such as Google's plans for a Metro version of Chrome, Microsoft seeking a patent for TV-viewing tolls, Encyclopaedia Britannica's switch to a digital-only format, and more. Screenshot by Asian Angel.

    Read the article

  • It's Coming: Chalk Talk with John

    - by Tanu Sood
    ...John Brunswick, that is. Who is this John Brunswick, you ask? John Brunswick is an Enterprise Architect with Oracle. As an Oracle Enterprise Architect, John focuses on the alignment of technical capabilities in support of business vision and objectives, as well as the overall business value of technology. What's more, he is pretty handy with animation and digital videos, as you will see shortly. Starting tomorrow, we will host a bi-weekly column with John called "Chalk Talk with John". In our "Chalk Talk with John" series, John will leverage his skills, experience and expertise (and his passion for digital animation) to discuss technology in business terms or, as he puts it, "so my ma understands what I do for a living". Through this series, John will explore the practical value of Middleware in the context of two fictional communities, shared through analogies aligned to enterprise technology. This format offers business stakeholders and IT a common language for understanding the benefits of technology in support of their business initiatives, regardless of their current level of technical knowledge. So, be sure to tune in tomorrow and every 2 weeks for "Chalk Talk with John".

    Read the article

  • Silverlight Cream for February 23, 2011 -- #1051

    - by Dave Campbell
    In this Issue: Ian T. Lackey, Kevin Hoffman, Kunal Chowdhury, Jesse Liberty(-2-), Page Brooks, Deborah Kurata(-2-), and Paul Sheriff. Above the Fold: Silverlight: "Building a Radar Control in Silverlight–Part 2" Page Brooks WP7: "Reactive Drag and Drop Part 2" Jesse Liberty Expression Blend: "Simple RadioButtonList and / or CheckBoxList in Silverlight Using a Behavior" Ian T. Lackey Shoutouts: Kunal Chowdhury delivered a full day session on Silverlight at the Microsoft Imagine Cup Championship event in Mumbai... you can Download Microsoft Imagine Cup Session PPT on Silverlight Dennis Doomen has appeared in my blog any number of times... he's looking for some assistance: Get me on stage on the Developer Days 2011 Steve Wortham posted An Interview with Jeff Wilcox From SilverlightCream.com: Simple RadioButtonList and / or CheckBoxList in Silverlight Using a Behavior Ian T. Lackey bemoans the lack of a RadioButtonList or CheckBoxList, and jumps into Blend to show us how to make one using a behavior... and the code is available too! WP7 for iPhone and Android Developers - Introduction to XAML and Silverlight Continuing his series at SilvelightShow for iPhone and Android devs, Kevin Hoffman has part 2 up getting into the UI with an intro to XAML and Silverlight. Day 1: Working with Telerik Silverlight RadControls Kunal Chowdhury kicked my tires that I had missed his Telerik control series... He's detailing his experience getting up to speed with the Silverlight RadControls. Day 1 is intro, what there is, installing, stuff like that. Part 2 continues: Day 2: Working with BusyIndicator of Telerik Silverlight RadControls, followed (so far) by part 3: Day 3: Working with Masked TextBox of Telerik Silverlight RadControls Reactive Drag and Drop Part 2 Jesse Liberty has his 7th part about Rx up ... and the 2nd part of Reactive Drag and Drop, and oh yeah... it's for WP7 as well! Yet Another Podcast #25–Glenn Block / WCF Next Jesse Liberty has Glenn Block on stage for his Yet Another Podcast number 25... talking WCF with Glenn. Building a Radar Control in Silverlight–Part 2 Page Brooks has part 2 of his 'radar' control for Silverlight up... I don't know where I'd use this, but it's darned cool... and the live demo is amazing. Silverlight Charting: Setting Colors Deborah Kurata is looking at the charting controls now, and how to set colors. She begins with a previous post on charts and adds color definitions to that post. Silverlight Charting: Setting the Tooltip Deborah Kurata next gets into formatting the tooltip you can get when the user hovers over a chart to make it make more sense to your user 'Content' is NOT 'Text' in XAML Paul Sheriff discusses the Content property of XAML controls and how it can be pretty much any other XAML you want it to be, then goes on to show some nice examples. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Google I/O 2011: JavaScript Programming in the Large with Closure Tools

    Google I/O 2011: JavaScript Programming in the Large with Closure Tools, presented by Michael Bolin. Most developers who have tinkered with JavaScript could not imagine writing 1,000 lines of code in such a language, let alone 100,000. Yet that is exactly what Google engineers have done using a suite of JavaScript tools named "Closure" to produce many of the most popular and sophisticated applications on the Web, such as Gmail and Google Maps. From: GoogleDevelopers. Time: 57:07.
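    For anyone curious what "programming in the large" looks like in practice, here is a tiny, illustrative fragment in the Closure style the talk covers (namespaced with goog.provide and typed with JSDoc annotations that the Closure Compiler checks before minifying); the names are invented for the example and are not taken from the talk.

        // Illustrative Closure-style JavaScript, not code from the talk
        goog.provide('myapp.Greeter');

        /**
         * @param {string} name Person to greet.
         * @constructor
         */
        myapp.Greeter = function(name) {
          /** @type {string} @private */
          this.name_ = name;
        };

        /** @return {string} */
        myapp.Greeter.prototype.greet = function() {
          return 'Hello, ' + this.name_ + '!';
        };

    The compiler uses these annotations to flag type errors across hundreds of files and to rename and inline aggressively, which is what makes 100,000-line JavaScript codebases manageable.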

    Read the article

  • Security Issues with Single Page Apps

    - by Stephen.Walther
    Last week, I was asked to do a code review of a Single Page App built using the ASP.NET Web API, Durandal, and Knockout (good stuff!). In particular, I was asked to investigate whether there any special security issues associated with building a Single Page App which are not present in the case of a traditional server-side ASP.NET application. In this blog entry, I discuss two areas in which you need to exercise extra caution when building a Single Page App. I discuss how Single Page Apps are extra vulnerable to both Cross-Site Scripting (XSS) attacks and Cross-Site Request Forgery (CSRF) attacks. This goal of this blog post is NOT to persuade you to avoid writing Single Page Apps. I’m a big fan of Single Page Apps. Instead, the goal is to ensure that you are fully aware of some of the security issues related to Single Page Apps and ensure that you know how to guard against them. Cross-Site Scripting (XSS) Attacks According to WhiteHat Security, over 65% of public websites are open to XSS attacks. That’s bad. By taking advantage of XSS holes in a website, a hacker can steal your credit cards, passwords, or bank account information. Any website that redisplays untrusted information is open to XSS attacks. Let me give you a simple example. Imagine that you want to display the name of the current user on a page. To do this, you create the following server-side ASP.NET page located at http://MajorBank.com/SomePage.aspx: <%@Page Language="C#" %> <html> <head> <title>Some Page</title> </head> <body> Welcome <%= Request["username"] %> </body> </html> Nothing fancy here. Notice that the page displays the current username by using Request[“username”]. Using Request[“username”] displays the username regardless of whether the username is present in a cookie, a form field, or a query string variable. Unfortunately, by using Request[“username”] to redisplay untrusted information, you have now opened your website to XSS attacks. Here’s how. Imagine that an evil hacker creates the following link on another website (hackers.com): <a href="/SomePage.aspx?username=<script src=Evil.js></script>">Visit MajorBank</a> Notice that the link includes a query string variable named username and the value of the username variable is an HTML <SCRIPT> tag which points to a JavaScript file named Evil.js. When anyone clicks on the link, the <SCRIPT> tag will be injected into SomePage.aspx and the Evil.js script will be loaded and executed. What can a hacker do in the Evil.js script? Anything the hacker wants. For example, the hacker could display a popup dialog on the MajorBank.com site which asks the user to enter their password. The script could then post the password back to hackers.com and now the evil hacker has your secret password. ASP.NET Web Forms and ASP.NET MVC have two automatic safeguards against this type of attack: Request Validation and Automatic HTML Encoding. Protecting Coming In (Request Validation) In a server-side ASP.NET app, you are protected against the XSS attack described above by a feature named Request Validation. If you attempt to submit “potentially dangerous” content — such as a JavaScript <SCRIPT> tag — in a form field or query string variable then you get an exception. Unfortunately, Request Validation only applies to server-side apps. Request Validation does not help in the case of a Single Page App. In particular, the ASP.NET Web API does not pay attention to Request Validation. You can post any content you want – including <SCRIPT> tags – to an ASP.NET Web API action. 
For example, the following HTML page contains a form. When you submit the form, the form data is submitted to an ASP.NET Web API controller on the server using an Ajax request: <!DOCTYPE html> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title></title> </head> <body> <form data-bind="submit:submit"> <div> <label> User Name: <input data-bind="value:user.userName" /> </label> </div> <div> <label> Email: <input data-bind="value:user.email" /> </label> </div> <div> <input type="submit" value="Submit" /> </div> </form> <script src="Scripts/jquery-1.7.1.js"></script> <script src="Scripts/knockout-2.1.0.js"></script> <script> var viewModel = { user: { userName: ko.observable(), email: ko.observable() }, submit: function () { $.post("/api/users", ko.toJS(this.user)); } }; ko.applyBindings(viewModel); </script> </body> </html> The form above is using Knockout to bind the form fields to a view model. When you submit the form, the view model is submitted to an ASP.NET Web API action on the server. Here’s the server-side ASP.NET Web API controller and model class: public class UsersController : ApiController { public HttpResponseMessage Post(UserViewModel user) { var userName = user.UserName; return Request.CreateResponse(HttpStatusCode.OK); } } public class UserViewModel { public string UserName { get; set; } public string Email { get; set; } } If you submit the HTML form, you don’t get an error. The “potentially dangerous” content is passed to the server without any exception being thrown. In the screenshot below, you can see that I was able to post a username form field with the value “<script>alert(‘boo’)</script”. So what this means is that you do not get automatic Request Validation in the case of a Single Page App. You need to be extra careful in a Single Page App about ensuring that you do not display untrusted content because you don’t have the Request Validation safety net which you have in a traditional server-side ASP.NET app. Protecting Going Out (Automatic HTML Encoding) Server-side ASP.NET also protects you from XSS attacks when you render content. By default, all content rendered by the razor view engine is HTML encoded. For example, the following razor view displays the text “<b>Hello!</b>” instead of the text “Hello!” in bold: @{ var message = "<b>Hello!</b>"; } @message   If you don’t want to render content as HTML encoded in razor then you need to take the extra step of using the @Html.Raw() helper. In a Web Form page, if you use <%: %> instead of <%= %> then you get automatic HTML Encoding: <%@ Page Language="C#" %> <% var message = "<b>Hello!</b>"; %> <%: message %> This automatic HTML Encoding will prevent many types of XSS attacks. It prevents <script> tags from being rendered and only allows &lt;script&gt; tags to be rendered which are useless for executing JavaScript. (This automatic HTML encoding does not protect you from all forms of XSS attacks. For example, you can assign the value “javascript:alert(‘evil’)” to the Hyperlink control’s NavigateUrl property and execute the JavaScript). The situation with Knockout is more complicated. If you use the Knockout TEXT binding then you get HTML encoded content. 
On the other hand, if you use the HTML binding then you do not: <!-- This JavaScript DOES NOT execute --> <div data-bind="text:someProp"></div> <!-- This Javacript DOES execute --> <div data-bind="html:someProp"></div> <script src="Scripts/jquery-1.7.1.js"></script> <script src="Scripts/knockout-2.1.0.js"></script> <script> var viewModel = { someProp : "<script>alert('Evil!')<" + "/script>" }; ko.applyBindings(viewModel); </script>   So, in the page above, the DIV element which uses the TEXT binding is safe from XSS attacks. According to the Knockout documentation: “Since this binding sets your text value using a text node, it’s safe to set any string value without risking HTML or script injection.” Just like server-side HTML encoding, Knockout does not protect you from all types of XSS attacks. For example, there is nothing in Knockout which prevents you from binding JavaScript to a hyperlink like this: <a data-bind="attr:{href:homePageUrl}">Go</a> <script src="Scripts/jquery-1.7.1.min.js"></script> <script src="Scripts/knockout-2.1.0.js"></script> <script> var viewModel = { homePageUrl: "javascript:alert('evil!')" }; ko.applyBindings(viewModel); </script> In the page above, the value “javascript:alert(‘evil’)” is bound to the HREF attribute using Knockout. When you click the link, the JavaScript executes. Cross-Site Request Forgery (CSRF) Attacks Cross-Site Request Forgery (CSRF) attacks rely on the fact that a session cookie does not expire until you close your browser. In particular, if you visit and login to MajorBank.com and then you navigate to Hackers.com then you will still be authenticated against MajorBank.com even after you navigate to Hackers.com. Because MajorBank.com cannot tell whether a request is coming from MajorBank.com or Hackers.com, Hackers.com can submit requests to MajorBank.com pretending to be you. For example, Hackers.com can post an HTML form from Hackers.com to MajorBank.com and change your email address at MajorBank.com. Hackers.com can post a form to MajorBank.com using your authentication cookie. After your email address has been changed, by using a password reset page at MajorBank.com, a hacker can access your bank account. To prevent CSRF attacks, you need some mechanism for detecting whether a request is coming from a page loaded from your website or whether the request is coming from some other website. The recommended way of preventing Cross-Site Request Forgery attacks is to use the “Synchronizer Token Pattern” as described here: https://www.owasp.org/index.php/Cross-Site_Request_Forgery_%28CSRF%29_Prevention_Cheat_Sheet When using the Synchronizer Token Pattern, you include a hidden input field which contains a random token whenever you display an HTML form. When the user opens the form, you add a cookie to the user’s browser with the same random token. When the user posts the form, you verify that the hidden form token and the cookie token match. Preventing Cross-Site Request Forgery Attacks with ASP.NET MVC ASP.NET gives you a helper and an action filter which you can use to thwart Cross-Site Request Forgery attacks. 
For example, the following razor form for creating a product shows how you use the @Html.AntiForgeryToken() helper: @model MvcApplication2.Models.Product <h2>Create Product</h2> @using (Html.BeginForm()) { @Html.AntiForgeryToken(); <div> @Html.LabelFor( p => p.Name, "Product Name:") @Html.TextBoxFor( p => p.Name) </div> <div> @Html.LabelFor( p => p.Price, "Product Price:") @Html.TextBoxFor( p => p.Price) </div> <input type="submit" /> } The @Html.AntiForgeryToken() helper generates a random token and assigns a serialized version of the same random token to both a cookie and a hidden form field. (Actually, if you dive into the source code, the AntiForgeryToken() does something a little more complex because it takes advantage of a user’s identity when generating the token). Here’s what the hidden form field looks like: <input name=”__RequestVerificationToken” type=”hidden” value=”NqqZGAmlDHh6fPTNR_mti3nYGUDgpIkCiJHnEEL59S7FNToyyeSo7v4AfzF2i67Cv0qTB1TgmZcqiVtgdkW2NnXgEcBc-iBts0x6WAIShtM1″ /> And here’s what the cookie looks like using the Google Chrome developer toolbar: You use the [ValidateAntiForgeryToken] action filter on the controller action which is the recipient of the form post to validate that the token in the hidden form field matches the token in the cookie. If the tokens don’t match then validation fails and you can’t post the form: public ActionResult Create() { return View(); } [ValidateAntiForgeryToken] [HttpPost] public ActionResult Create(Product productToCreate) { if (ModelState.IsValid) { // save product to db return RedirectToAction("Index"); } return View(); } How does this all work? Let’s imagine that a hacker has copied the Create Product page from MajorBank.com to Hackers.com – the hacker grabs the HTML source and places it at Hackers.com. Now, imagine that the hacker trick you into submitting the Create Product form from Hackers.com to MajorBank.com. You’ll get the following exception: The Cross-Site Request Forgery attack is blocked because the anti-forgery token included in the Create Product form at Hackers.com won’t match the anti-forgery token stored in the cookie in your browser. The tokens were generated at different times for different users so the attack fails. Preventing Cross-Site Request Forgery Attacks with a Single Page App In a Single Page App, you can’t prevent Cross-Site Request Forgery attacks using the same method as a server-side ASP.NET MVC app. In a Single Page App, HTML forms are not generated on the server. Instead, in a Single Page App, forms are loaded dynamically in the browser. Phil Haack has a blog post on this topic where he discusses passing the anti-forgery token in an Ajax header instead of a hidden form field. He also describes how you can create a custom anti-forgery token attribute to compare the token in the Ajax header and the token in the cookie. See: http://haacked.com/archive/2011/10/10/preventing-csrf-with-ajax.aspx Also, take a look at Johan’s update to Phil Haack’s original post: http://johan.driessen.se/posts/Updated-Anti-XSRF-Validation-for-ASP.NET-MVC-4-RC (Other server frameworks such as Rails and Django do something similar. For example, Rails uses an X-CSRF-Token to prevent CSRF attacks which you generate on the server – see http://excid3.com/blog/rails-tip-2-include-csrf-token-with-every-ajax-request/#.UTFtgDDkvL8 ). 
For example, if you are creating a Durandal app, then you can use the following razor view for your one and only server-side page: @{ Layout = null; } <!DOCTYPE html> <html> <head> <title>Index</title> </head> <body> @Html.AntiForgeryToken() <div id="applicationHost"> Loading app.... </div> @Scripts.Render("~/scripts/vendor") <script type="text/javascript" src="~/App/durandal/amd/require.js" data-main="/App/main"></script> </body> </html> Notice that this page includes a call to @Html.AntiForgeryToken() to generate the anti-forgery token. Then, whenever you make an Ajax request in the Durandal app, you can retrieve the anti-forgery token from the razor view and pass the token as a header: var csrfToken = $("input[name='__RequestVerificationToken']").val(); $.ajax({ headers: { __RequestVerificationToken: csrfToken }, type: "POST", dataType: "json", contentType: 'application/json; charset=utf-8', url: "/api/products", data: JSON.stringify({ name: "Milk", price: 2.33 }), statusCode: { 200: function () { alert("Success!"); } } }); Use the following code to create an action filter which you can use to match the header and cookie tokens: using System.Linq; using System.Net.Http; using System.Web.Helpers; using System.Web.Http.Controllers; namespace MvcApplication2.Infrastructure { public class ValidateAjaxAntiForgeryToken : System.Web.Http.AuthorizeAttribute { protected override bool IsAuthorized(HttpActionContext actionContext) { var headerToken = actionContext .Request .Headers .GetValues("__RequestVerificationToken") .FirstOrDefault(); ; var cookieToken = actionContext .Request .Headers .GetCookies() .Select(c => c[AntiForgeryConfig.CookieName]) .FirstOrDefault(); // check for missing cookie or header if (cookieToken == null || headerToken == null) { return false; } // ensure that the cookie matches the header try { AntiForgery.Validate(cookieToken.Value, headerToken); } catch { return false; } return base.IsAuthorized(actionContext); } } } Notice that the action filter derives from the base AuthorizeAttribute. The ValidateAjaxAntiForgeryToken only works when the user is authenticated and it will not work for anonymous requests. Add the action filter to your ASP.NET Web API controller actions like this: [ValidateAjaxAntiForgeryToken] public HttpResponseMessage PostProduct(Product productToCreate) { // add product to db return Request.CreateResponse(HttpStatusCode.OK); } After you complete these steps, it won’t be possible for a hacker to pretend to be you at Hackers.com and submit a form to MajorBank.com. The header token used in the Ajax request won’t travel to Hackers.com. This approach works, but I am not entirely happy with it. The one thing that I don’t like about this approach is that it creates a hard dependency on using razor. Your single page in your Single Page App must be generated from a server-side razor view. A better solution would be to generate the anti-forgery token in JavaScript. Unfortunately, until all browsers support a way to generate cryptographically strong random numbers – for example, by supporting the window.crypto.getRandomValues() method — there is no good way to generate anti-forgery tokens in JavaScript. So, at least right now, the best solution for generating the tokens is the server-side solution with the (regrettable) dependency on razor. Conclusion The goal of this blog entry was to explore some ways in which you need to handle security differently in the case of a Single Page App than in the case of a traditional server app. 
In particular, I focused on how to prevent Cross-Site Scripting and Cross-Site Request Forgery attacks in the case of a Single Page App. I want to emphasize that I am not suggesting that Single Page Apps are inherently less secure than server-side apps. Whatever type of web application you build – regardless of whether it is a Single Page App, an ASP.NET MVC app, an ASP.NET Web Forms app, or a Rails app – you must constantly guard against security vulnerabilities.

    Read the article

  • My Thoughts On the Xbox 180

    - by Chris Gardner
    Originally posted on: http://geekswithblogs.net/freestylecoding/archive/2013/06/21/my-thoughts-on-the-xbox-180.aspx Everyone seems to be putting their 0.00237 cents into the wishing well over Microsoft's recent decision to reverse the DRM policy on the Xbox One. However, there have been a few issues that nobody has touched. As such, I have decided to dig 0.00237 cents out of my pocket. First, let me be clear about this point. I do not support the decision to reverse the DRM policy on the Xbox One. I wanted that point to be expressed first and unambiguously. I will say it again. I do not support the decision to reverse the DRM policy on the Xbox One. Now that I have that out of the way, let me go into my rationale. This decision removes most of the cool features that enticed me to pre-order the console. No, I didn't cancel my pre-order. There is still five months before the release of the console, and there is still a plethora of information that we, as consumers, do not have. With that, it should be noted that much of the talk in this post is speculation and rhetoric. I do not have any insider information that you do not possess. The persistent connection would have allowed the console to do many of the functions for which we have been begging. That demo where someone was playing Ryse, seamlessly accepted a multiplayer challenge in Killer Instinct, played the match (and a rematch,) and then jumped back into Ryse. That's gone, if you bought the game on disc. The new, DRM free system will require the disc in the system to play a game. That bullet point where one Xbox Live account could have up to 10 slave accounts so families could play together, no matter where they were located. That's gone as well. The promise of huge, expansive, dynamically changing worlds that was brought to us with the power of cloud computing. Well, "the people" didn't want there to be a forced, persistent connection. As such, developers can't rely on a connection and, as such, that feature is gone. This is akin to the removal of the hard drive on the Xbox 360. The list continues, but the enthusiast press has enumerated the list far better than I wish. All of this is because the Xbox team saw the HUGE success of Steam and decided to borrow a few ideas. Yes, Steam. The service that everyone hated for the first six months (for the same reasons the Xbox One is getting flack.) There was an initial growing pain. However, it is now lauded as the way games distribution should be handled. Unless you are Microsoft. I do find it curious that many of the features were originally announced for the PS4 during its unveiling. However, much of that was left strangely absent for Sony's E3 press conference. Instead, we received a single, static slide that basically said the exact opposite of Microsoft's plans. It is not farfetched to believe that slide came into existence during the approximately seven hours between the two media briefings. The thing that majorly annoys me over this whole kerfuffle is that the single thing that caused the call to arms is, really, not an issue. Microsoft never said they were going to block used sales. They said it was up to the publisher to make that decision. This would have allowed publishers to reclaim some of the costs of development in subsequent sales of the product. If you sell your game to GameStop for 7 USD, GameStop is going to sell it for 55 USD. That is 48 USD pure profit for them. Some publishers asked GameStop for a small cut. Was this a huge, money grubbing scheme? 
Well, yes, but the idea was that they have to handle server infrastructure for dormant accounts, etc. Of course, GameStop flatly refused, and the Online Pass was born. Fortunately, this trend didn’t last, and most publishers have stopped the practice. The ability to sell "licenses" has already begun to be challenged. Are you living in the EU? If so, companies must allow you to sell digital property. With this precedent in place, it's only a matter of time before other areas follow suit. If GameStop were smart, they should have immediately contacted every publisher out there to get the rights to become a clearing house for these licenses. Then, they keep their business model and could reduce their brick and mortar footprint. The digital landscape is changing. We need to not block this process. As Seth MacFarlane best said "Some issues are so important that you should drag people kicking and screaming." I believe this was said on an episode of Real Time with Bill Maher about the issue of Gay Marriages. Much like the original source, this is an issue that we need to drag people to the correct, progressive position. Microsoft, as a company, actually has the resources to weather the transition period. They have a great pool of first and second party developers that can leverage this new framework to prove the validity. Over time, the third party developers will get excited to use these tools. As an old C++ guy, I resisted C# for years. Now, I think it's one of the best languages I've ever used. I have a server room and a Co-Lo full of servers, so I originally didn't see the value in Azure. Now, I wish I could move every one of my projects into the cloud. I still LOVE getting physical packaging, which my music and games collection will proudly attest. However, I have started to see the value in pure digital, and have found ways to integrate this into the ways I consume those products. I can, honestly, understand how some parts of the population would be very apprehensive about this new landscape. There were valid arguments about people with no internet access. There are ways to combat these problems. These methods do not require us to throw the baby out with the bathwater. However, the number of people in the computer industry that I have seen cry foul is truly appalling. We are the forward looking people that help show how technology can improve people's lives. If we can't see the value of the brief pain involved with an exciting new ecosystem, than who will?

    Read the article

  • European Interoperability Framework - a new beginning?

    - by trond-arne.undheim
    The most controversial document in the history of the European Commission's IT policy is out. EIF is here, wrapped in the Communication "Towards interoperability for European public services", and including the new feature European Interoperability Strategy (EIS), arguably a higher strategic take on the same topic. Leaving EIS aside for a moment, the EIF controversy has been around IPR, defining open standards and about the proper terminology around standardization deliverables. Today, as the document finally emerges, what is the verdict? First of all, to be fair to those among you who do not spend your lives in the intricate labyrinths of Commission IT policy documents on interoperability, let's define what we are talking about. According to the Communication: "An interoperability framework is an agreed approach to interoperability for organisations that want to collaborate to provide joint delivery of public services. Within its scope of applicability, it specifies common elements such as vocabulary, concepts, principles, policies, guidelines, recommendations, standards, specifications and practices." The Good - EIF reconfirms that "The Digital Agenda can only take off if interoperability based on standards and open platforms is ensured" and also confirms that "The positive effect of open specifications is also demonstrated by the Internet ecosystem." - EIF takes a productive and pragmatic stance on openness: "In the context of the EIF, openness is the willingness of persons, organisations or other members of a community of interest to share knowledge and stimulate debate within that community, the ultimate goal being to advance knowledge and the use of this knowledge to solve problems" (p.11). "If the openness principle is applied in full: - All stakeholders have the same possibility of contributing to the development of the specification and public review is part of the decision-making process; - The specification is available for everybody to study; - Intellectual property rights related to the specification are licensed on FRAND terms or on a royalty-free basis in a way that allows implementation in both proprietary and open source software" (p. 26). - EIF is a formal Commission document. The former EIF 1.0 was a semi-formal deliverable from the PEGSCO, a working group of Member State representatives. - EIF tackles interoperability head-on and takes a clear stance: "Recommendation 22. When establishing European public services, public administrations should prefer open specifications, taking due account of the coverage of functional needs, maturity and market support." - The Commission will continue to support the National Interoperability Framework Observatory (NIFO), reconfirming the importance of coordinating such approaches across borders. - The Commission will align its internal interoperability strategy with the EIS through the eCommission initiative. - One cannot stress the importance of using open standards enough, whether in the context of open source or non-open source software. The EIF seems to have picked up on this fact: What does the EIF says about the relation between open specifications and open source software? The EIF introduces, as one of the characteristics of an open specification, the requirement that IPRs related to the specification have to be licensed on FRAND terms or on a royalty-free basis in a way that allows implementation in both proprietary and open source software. 
In this way, companies working under various business models can compete on an equal footing when providing solutions to public administrations while administrations that implement the standard in their own software (software that they own) can share such software with others under an open source licence if they so decide. - EIF is now among the center pieces of the Digital Agenda (even though this demands extensive inter-agency coordination in the Commission): "The EIS and the EIF will be maintained under the ISA Programme and kept in line with the results of other relevant Digital Agenda actions on interoperability and standards such as the ones on the reform of rules on implementation of ICT standards in Europe to allow use of certain ICT fora and consortia standards, on issuing guidelines on essential intellectual property rights and licensing conditions in standard-setting, including for ex-ante disclosure, and on providing guidance on the link between ICT standardisation and public procurement to help public authorities to use standards to promote efficiency and reduce lock-in.(Communication, p.7)" All in all, quite a few good things have happened to the document in the two years it has been on the shelf or was being re-written, depending on your perspective, in any case, awaiting the storms to calm. The Bad - While a certain pragmatism is required, and governments cannot migrate to full openness overnight, EIF gives a bit too much room for governments not to apply the openness principle in full. Plenty of reasons are given, which should maybe have been put as challenges to be overcome: "However, public administrations may decide to use less open specifications, if open specifications do not exist or do not meet functional interoperability needs. In all cases, specifications should be mature and sufficiently supported by the market, except if used in the context of creating innovative solutions". - EIF does not use the internationally established terminology: open standards. Rather, the EIF introduces the notion of "formalised specification". How do "formalised specifications" relate to "standards"? According to the FAQ provided: The word "standard" has a specific meaning in Europe as defined by Directive 98/34/EC. Only technical specifications approved by a recognised standardisation body can be called a standard. Many ICT systems rely on the use of specifications developed by other organisations such as a forum or consortium. The EIF introduces the notion of "formalised specification", which is either a standard pursuant to Directive 98/34/EC or a specification established by ICT fora and consortia. The term "open specification" used in the EIF, on the one hand, avoids terminological confusion with the Directive and, on the other, states the main features that comply with the basic principle of openness laid down in the EIF for European Public Services. Well, this may be somewhat true, but in reality, Europe is 30 year behind in terminology. Unless the European Standardization Reform gets completed in the next few months, most Member States will likely conclude that they will go on referencing and using standards beyond those created by the three European endorsed monopolists of standardization, CEN, CENELEC and ETSI. Who can afford to begin following the strict Brussels rules for what they can call open standards when, in reality, standards stemming from global standardization organizations, so-called fora/consortia, dominate in the IT industry. What exactly is EIF saying? 
Does it encourage Member States to go on using non-ESO standards as long as they call it something else? I guess I am all for it, although it is a bit cumbersome, no? Why was there so much interest around the EIF? The FAQ attempts to explain: Some Member States have begun to adopt policies to achieve interoperability for their public services. These actions have had a significant impact on the ecosystem built around the provision of such services, e.g. providers of ICT goods and services, standardisation bodies, industry fora and consortia, etc... The Commission identified a clear need for action at European level to ensure that actions by individual Member States would not create new electronic barriers that would hinder the development of interoperable European public services. As a result, all stakeholders involved in the delivery of electronic public services in Europe have expressed their opinions on how to increase interoperability for public services provided by the different public administrations in Europe. Well, it does not take two years to read 50 consultation documents, and the EU Standardization Reform is not yet completed, so, more pragmatically, you finally had to release the document. Ok, let's leave some of that aside because the document is out and some people are happy (and others definitely not). The Verdict Considering the controversy, the delays, the lobbying, and the interests at stake both in the EU, in Member States and among vendors large and small, this document is pretty impressive. As with a good wine that has not yet come to full maturity, let's say that it seems to be coming in in the 85-88/100 range, but only a more fine-grained analysis, enjoyment in good company, and ultimately, implementation, will tell. The European Commission has today adopted a significant interoperability initiative to encourage public administrations across the EU to maximise the social and economic potential of information and communication technologies. Today, we should rally around this achievement. Tomorrow, let's sit down and figure out what it means for the future.

    Read the article

  • Antenna Aligner Part 10: Updates and emails…

    - by Chris George
    Since my last post back in July, I've not done huge amounts of work on my app for two reasons. Firstly, no time! Secondly, I wanted to leave it out in the wild for a while and see what happened. Well, what happened? Over 1,300 users, that's what's happened! This uptake is beyond my wildest expectations, and apart from a couple of issues that I'll mention in a minute, most of the feedback has been very positive indeed! I've had several emails giving me feedback and reporting issues, all of which I have made a point of replying to immediately. This act alone has met with favourable replies! One of the main issues was with the iPad. It turns out that my app is only accurate in portrait mode; turning it to landscape will offset the direction by ±90 degrees! Whoops! I think I've fixed this by disabling the orientation switching, but I have not yet had an iPad to test this on. I had several emails from iPod Touch users claiming the app did not work for them. Specifically, the compass view did not work. On investigation, it turns out that the iPod Touch does not have the compass hardware required to do this. Unfortunately there is no way to exclude iPod Touch devices from the list of supported devices, so I've just had to make it very clear in the iTunes description that the device is not fully supported. You can still get the list of transmitters, but you then have to use a real compass to get the bearing. But that's not the end of the world. Several customers have requested that the aerial polarisation be displayed in the app. I was already working on this, and the data was already there; it was just a case of displaying it in the UI. I have a solution now, and this will be in the next release. Of course, with the digital switchover in full swing across the UK, there has been one set of data updates (in 1.0.3), and another is due shortly. This reflects the transmitters as they switch over to digital fully and their power output is increased. So all in all I'm very pleased with the feedback I've had, and I'm looking to get the next release out there by early December (allowing for the 2-3 week Apple approval lag!)

    Read the article

  • IT Admin for Thrill Seekers

    - by Tony Davis
    A developer suggested to me recently that the life of the DBA was, surely, a dull one. My first reaction was indignation, but it was quickly followed by the thought that for many people excitement isn't necessarily the most desirable aspect of their job. It's true that some aspects of the DBA role seem guaranteed to quieten the pulse; in the days of tape backups, time must have slowed to eternity for the person whose job it was to oversee this process, placing tapes into secure containers, ensuring correct labeling, and... sorry, I drifted off there for a second. On the other hand, if you follow the adventures of the likes of Brent Ozar or Tom LaRock, you'd be forgiven for thinking that much of a database guy's time is spent, metaphorically, diving through plate glass windows in tight-fitting underwear in order to extract grateful occupants from burning database applications. Alas it isn't true of the majority, but it isn't as dull as some people imagine, and is a helter-skelter ride compared with some other IT roles. Every IT department has people who toil away in shadowy corners doing quiet but mysterious tasks. When you ask them to explain what they do, you almost immediately want them to stop, but you hear enough to appreciate that these tasks are often absolutely vital to the smooth functioning of an IT organization. Compared with them, the DBAs are prima donnas. Here are a few nominations:
    Installation engineer - installs all of the company's laptops, workstations, and software; deals with licensing, shipping and data entry. Many organizations, especially those subject to tight regulation, would simply grind to a halt without their efforts.
    Localization engineer - not quite software engineering, not quite translation; the job is to rebuild a product in a different language and make sure everything still works.
    QA Tester - firstly, I should say that the testers at Red Gate seem to me some of the most fulfilled in the company. I refer here to the QA Tester whose job is more or less entirely to read a script, click some buttons and make sure the actual and expected values match.
    Configuration manager - for example, someone whose main job is to configure build environments so that devs can access their source code; assuredly necessary for the smooth functioning and productivity of the team, and hopefully well paid.
    So what other sort of job in IT should one choose if the work of a DBA proves to be too exciting? Or are these roles secretly more exciting than many imagine? I invite you all to put forward your own suggestions. Cheers, Tony.

    Read the article

  • Should you ever re-estimate user stories?

    - by f1dave
    My current project is having a 'discussion' which is split down the middle: "this story is more complex than we originally thought, so we should re-estimate" vs. "you should never re-estimate, as you only ever estimate up and never down". Can anyone shed some light on whether you should ever re-estimate? IMHO, I'd imagine you could bring up an entirely new card for a new requirement or story, but going back and re-estimating backlog items seems to skew the concept of relative sizing and will only ever 'inflate' your backlog.

    Read the article

  • Troubleshooting Windows Blue Screen Errors

    The so-called ‘Blue Screen of Death’ has inspired fear in the hearts of mere mortals, but Systems Administrators are expected to be capable of casually beating back this sinister beast. So imagine Ben Lye’s distress when he discovered that many aspiring SysAdmins had no structured approach to tackling the root of the problem. Setting out to remedy the situation, Ben lays out a simple three-step plan and dispenses some good advice.

    Read the article

  • Can LittleBigPlanet2's engine be used for other kinds of games?

    - by Bill
    LittleBigPlanet2 just came out. I've worked with the original LBP level editor a bit and really enjoyed it. I've read that LBP2's in-game feature set is much richer; is it possible to use these advanced features to create different sorts of games, other than just a regular platformer? I imagine that something along the lines of a Breakout clone would definitely be manageable, but I'm interested in hearing more about the capabilities of the platform.

    Read the article

  • Share and Deliver BI Publisher Reports in Multiple Languages

    - by kanichiro.nishida
    When you share your reports with people who speak and read different languages, you want your reports to be shown in their language, right? Well, translating reports with BI Publisher is not only easy but also greatly reduces the maintenance cost. Many of us in the BI Publisher product development team used to work in Globalization and Multilingual support, which enables Oracle products and applications to be used in many different languages, countries and territories, so we have a lot of experience in this area. In fact, BI Publisher is the strategic reporting platform for Oracle EBS, PeopleSoft, JD Edwards, Siebel, and many other Oracle application products, and our customers from all over the world generate thousands upon thousands of reports - both out-of-the-box reports from Oracle and customer-created or customized ones - in their own local languages every day as they operate and manage their business. Today, I'm going to talk about this very topic: how to translate your reports with BI Publisher 11G.

    Translation Grows, Not the Number of Reports

    Most reporting tools, traditional or new, treat translation as an afterthought. They require their users to copy an original report and translate the whole thing, so when you want to support an additional 10 languages you need 10 copies of the original. Imagine when you have 50 reports: you end up with 500 reports (50 x 10)! Now you need to maintain these 500 reports; whenever you make a change in one report, you have to apply the same change to all of its copies. As you can imagine, this is not only a nightmare for IT management, it is simply not acceptable for applications like Oracle EBS that support over 30 languages.

    So the first thing we did was very simple: we separated the translation out of the report and marry it to the report only at report generation time. This means that regardless of how many languages you need to support, you have only one report, plus one translation file per language containing the translated strings. So if you have 50 reports and need to support 10 languages, you still have only 50 reports, and each report has 10 language translation files. Yes, translation is what should grow as you add more languages, not the number of reports!

    Second, we provide the translation files in XLIFF format, an international-standard, XML-based format for exchanging and maintaining translation strings. Once you generate the XLIFF files for your reports with BI Publisher, you can work with any translation vendor in the world to do a mass translation, or you can translate the files yourself by manually updating the translatable strings in the text file. (A minimal example of what such a file looks like appears at the end of this post.)

    Lastly, we made it easier to manage the translation process, from generating the XLIFF files to uploading the translated XLIFF files back to the BI Publisher server. You can generate, download, and upload the XLIFF files from BI Publisher's web interface in your browser, and you can see the translated reports right away without needing to shut down or restart your server. While the translated reports are displayed based on your language preference setting, you can also specify a different language when you schedule or deliver the reports, so that they are generated in your customer's preferred language.

    What Can I Translate?

    When it comes to translation there are three things.

    First, report content translation. When you receive a report, you want to see the content - report title, section titles, comments, annotations, table column headers, and anything else that is static and embedded in the report - in your preferred language. We call this Reports Content translation.

    Second, when you open a report online you might want not only the report content but also the report UI - report name, parameter names, layout names, and anything else that helps you navigate around the reports - to be translated into your language. We call this Reports UI translation. This separation of Reports Content and Reports UI translation is very useful when you want to navigate through the reports in your own preferred language but generate the reports in your customer's preferred language. Imagine you are a native English speaker and need to generate and send a report to your customers in China. You want to see the report name and parameter names in English so that you can comfortably navigate to the report and generate the output, but you want the report generated in Chinese so that your customers in China can understand it when they receive it.

    And lastly, you might want even the data presented in the report to be translated. For example, you might want the product names in an Order Status report to be translated based on the report viewer's language preference. We call this Reporting Data translation. Since Reporting Data translation is maintained at the data source level (for example, in database tables alongside the main data), you need to prepare the translation at the data source level first. Then you need to make sure that your query switches accordingly, based on the language preference setting, so that the translated data is retrieved.

    How to Translate BI Publisher Reports?

    Now, when it comes to 'how to translate BI Publisher reports?', the main focus here is the translation of the Report Content and Report UI. I have created a video to show you how to create and manage the translation with BI Publisher 11G; please take a look at the clip below.

    In today's business world, customers and suppliers come from all over the world, regardless of the size of the company or organization. Supporting multiple languages in your reports is no longer something 'nice to have'; it's mandatory. BI Publisher is designed to support multilingual reports from the beginning, without any extra hidden cost of license or configuration, unlike other reporting tools such as Crystal Reports. You can add translations for additional languages at any time with the very simple steps shown in the video above. Happy translation! Please share your translation experience with us!
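    To make the XLIFF idea above concrete, here is a minimal, hand-written illustration of the kind of file involved. It follows the generic XLIFF 1.x structure; the ids, attribute values and strings are invented for this example and are not the exact output of any particular BI Publisher version.

        <?xml version="1.0" encoding="utf-8"?>
        <xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
          <!-- One file element per report layout; source and target languages are declared up front -->
          <file original="OrderStatusReportLayout" datatype="xml"
                source-language="en-US" target-language="zh-CN">
            <body>
              <!-- Each translatable string becomes a trans-unit: source stays fixed, target is edited -->
              <trans-unit id="report_title">
                <source>Order Status Report</source>
                <target>订单状态报告</target>
              </trans-unit>
              <trans-unit id="col_product_name">
                <source>Product Name</source>
                <target>产品名称</target>
              </trans-unit>
            </body>
          </file>
        </xliff>

    The point of the format is that a translator (or a translation vendor's tooling) only ever edits the target elements; the report layout itself is never copied or modified, which is what keeps the maintenance effort flat as more languages are added.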

    Read the article
