Search Results

Search found 2140 results on 86 pages for 'iso 8859 1'.

Page 21 of 86

  • BasicAuthProvider in ServiceStack

    - by Per
    I've got an issue with the BasicAuthProvider in ServiceStack. POST-ing to the CredentialsAuthProvider (/auth/credentials) is working fine. The problem is that when GET-ing (in Chrome): http://foo:pwd@localhost:81/tag/string/list the following is the result Handler for Request not found: Request.HttpMethod: GET Request.HttpMethod: GET Request.PathInfo: /login Request.QueryString: System.Collections.Specialized.NameValueCollection Request.RawUrl: /login?redirect=http%3a%2f%2flocalhost%3a81%2ftag%2fstring%2flist which tells me that it redirected me to /login instead of serving the /tag/... request. Here's the entire code for my AppHost: public class AppHost : AppHostHttpListenerBase, IMessageSubscriber { private ITagProvider myTagProvider; private IMessageSender mySender; private const string UserName = "foo"; private const string Password = "pwd"; public AppHost( TagConfig config, IMessageSender sender ) : base( "BM App Host", typeof( AppHost ).Assembly ) { myTagProvider = new TagProvider( config ); mySender = sender; } public class CustomUserSession : AuthUserSession { public override void OnAuthenticated( IServiceBase authService, IAuthSession session, IOAuthTokens tokens, System.Collections.Generic.Dictionary<string, string> authInfo ) { authService.RequestContext.Get<IHttpRequest>().SaveSession( session ); } } public override void Configure( Funq.Container container ) { Plugins.Add( new MetadataFeature() ); container.Register<BeyondMeasure.WebAPI.Services.Tags.ITagProvider>( myTagProvider ); container.Register<IMessageSender>( mySender ); Plugins.Add( new AuthFeature( () => new CustomUserSession(), new AuthProvider[] { new CredentialsAuthProvider(), //HTML Form post of UserName/Password credentials new BasicAuthProvider(), //Sign-in with Basic Auth } ) ); container.Register<ICacheClient>( new MemoryCacheClient() ); var userRep = new InMemoryAuthRepository(); container.Register<IUserAuthRepository>( userRep ); string hash; string salt; new SaltedHash().GetHashAndSaltString( Password, out hash, out salt ); // Create test user userRep.CreateUserAuth( new UserAuth { Id = 1, DisplayName = "DisplayName", Email = "[email protected]", UserName = UserName, FirstName = "FirstName", LastName = "LastName", PasswordHash = hash, Salt = salt, }, Password ); } } Could someone please tell me what I'm doing wrong with either the SS configuration or how I am calling the service, i.e. why does it not accept the supplied user/pwd? Update1: Request/Response captured in Fiddler2when only BasicAuthProvider is used. No Auth header sent in the request, but also no Auth header in the response. GET /tag/string/AAA HTTP/1.1 Host: localhost:81 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA== HTTP/1.1 302 Found Location: /login?redirect=http%3a%2f%2flocalhost%3a81%2ftag%2fstring%2fAAA Server: Microsoft-HTTPAPI/2.0 X-Powered-By: ServiceStack/3,926 Win32NT/.NET Date: Sat, 10 Nov 2012 22:41:51 GMT Content-Length: 0 Update2 Request/Response with HtmlRedirect = null . 
SS now answers with the Auth header, which Chrome then issues a second request for and authentication succeeds GET http://localhost:81/tag/string/Abc HTTP/1.1 Host: localhost:81 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA== HTTP/1.1 401 Unauthorized Transfer-Encoding: chunked Server: Microsoft-HTTPAPI/2.0 X-Powered-By: ServiceStack/3,926 Win32NT/.NET WWW-Authenticate: basic realm="/auth/basic" Date: Sat, 10 Nov 2012 22:49:19 GMT 0 GET http://localhost:81/tag/string/Abc HTTP/1.1 Host: localhost:81 Connection: keep-alive Authorization: Basic Zm9vOnB3ZA== User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA==

    Read the article

  • How to copy child nodes to another xml document?

    - by Alex
    Below is my xml XML1 <?xml version="1.0" encoding="ISO-8859-1" ?> <CATALOG> <CD> <TITLE>1</TITLE> <ARTIST>Bob Dylan</ARTIST> <COUNTRY>USA</COUNTRY> <COMPANY>Columbia</COMPANY> <PRICE>10.90</PRICE> <YEAR>1985</YEAR> </CD> <CD> <TITLE>2</TITLE> <ARTIST>Bonnie Tyler</ARTIST> <COUNTRY>UK</COUNTRY> <COMPANY>CBS Records</COMPANY> <PRICE>9.90</PRICE> <YEAR>1988</YEAR> </CD> </CATALOG> XML2 <?xml version="1.0" encoding="ISO-8859-1" ?> <CATALOG> <CD> <TITLE>3</TITLE> <ARTIST>Dolly Parton</ARTIST> <COUNTRY>USA</COUNTRY> <COMPANY>RCA</COMPANY> <PRICE>9.90</PRICE> <YEAR>1982</YEAR> </CD> </CATALOG> i need output like this <?xml version="1.0" encoding="ISO-8859-1" ?> <CATALOG> <CD> <TITLE>1</TITLE> <ARTIST>Bob Dylan</ARTIST> <COUNTRY>USA</COUNTRY> <COMPANY>Columbia</COMPANY> <PRICE>10.90</PRICE> <YEAR>1985</YEAR> </CD> <CD> <TITLE>2</TITLE> <ARTIST>Bonnie Tyler</ARTIST> <COUNTRY>UK</COUNTRY> <COMPANY>CBS Records</COMPANY> <PRICE>9.90</PRICE> <YEAR>1988</YEAR> </CD> <CD> <TITLE>3</TITLE> <ARTIST>Dolly Parton</ARTIST> <COUNTRY>USA</COUNTRY> <COMPANY>RCA</COMPANY> <PRICE>9.90</PRICE> <YEAR>1982</YEAR> </CD> </CATALOG> How i write this in classic asp ?
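    The question asks for classic ASP (MSXML), which is not shown here; as a hedged, language-neutral illustration of the approach itself (import each CD node from the second document into the first, then append it under CATALOG), here is a sketch using Java's DOM API. The file names xml1.xml, xml2.xml and merged.xml are assumptions for the example.

        import java.io.File;
        import javax.xml.parsers.DocumentBuilder;
        import javax.xml.parsers.DocumentBuilderFactory;
        import javax.xml.transform.OutputKeys;
        import javax.xml.transform.Transformer;
        import javax.xml.transform.TransformerFactory;
        import javax.xml.transform.dom.DOMSource;
        import javax.xml.transform.stream.StreamResult;
        import org.w3c.dom.Document;
        import org.w3c.dom.Element;
        import org.w3c.dom.Node;
        import org.w3c.dom.NodeList;

        public class MergeCatalogs {
            public static void main(String[] args) throws Exception {
                DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();
                Document target = db.parse(new File("xml1.xml"));   // hypothetical file names
                Document source = db.parse(new File("xml2.xml"));

                Element targetCatalog = target.getDocumentElement();
                NodeList cds = source.getElementsByTagName("CD");
                for (int i = 0; i < cds.getLength(); i++) {
                    // importNode copies the node (deep = true includes children) into the target document
                    Node imported = target.importNode(cds.item(i), true);
                    targetCatalog.appendChild(imported);
                }

                Transformer t = TransformerFactory.newInstance().newTransformer();
                t.setOutputProperty(OutputKeys.ENCODING, "ISO-8859-1"); // keep the original declaration
                t.transform(new DOMSource(target), new StreamResult(new File("merged.xml")));
            }
        }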

    Read the article

  • Servlet is not invoked when I call it from a form

    - by saloni
    My web.xml is like <web-app version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd"> <display-name>sample</display-name> <servlet> <servlet-name>Sampleclass</servlet-name> <servlet-class>sample.SampleClass</servlet-class> </servlet> <servlet-mapping> <servlet-name>Sampleclass</servlet-name> <url-pattern>/SampleClass</url-pattern> </servlet-mapping> <welcome-file-list> <welcome-file>/page/form.jsp</welcome-file> </welcome-file-list> </web-app> and form.jsp <%@ page language="java" contentType="text/html; charset=ISO-8859-1" pageEncoding="ISO-8859-1"%> <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"> <html> <head> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"> <title>Insert title here</title> </head> <body> <h1>A simple web application</h1> <form method="POST" name="Sampleclass" action="SampleClass"> <label for="name">Enter your name </label> <input type="text" id="name" name="name"/><br><br> <input type="submit" value="Submit Form"/> <input type="reset" value="Reset Form"/> </form> </body> </html> and SampleClass.java is public class SampleClass extends HttpServlet { public void init(ServletConfig config) throws ServletException { super.init(config); } protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException { String name = request.getParameter("name"); String age = request.getParameter("age"); PrintWriter out = response.getWriter(); out.write("<html>Hello Your name is "+name +", and your age is "+age+"</html>"); } public void destroy() { } } but I am getting error when I entered to submit button of form.jsp and error is type Status report message HTTP method POST is not supported by this URL description The specified HTTP method is not allowed for the requested resource (HTTP method POST is not supported by this URL). I am not understanding that what is the problem exactly ? Please help..

    Read the article

  • Trouble with client-side validation using Struts 2: XML-based validation rules not recognized

    - by Kartik
    Hi, This is my first post to ask a question on stack overflow and my issue is that when I don't see a client side validation error message when I don't enter any values for that field even when it is configured as required. The input page is reloaded but no error message is seen. I am not sure what I am doing wrong. Any advice would be greatly appreciated. My scenario is given below: I have a simple form where I have a pull down menu called selection criterion. A value must be selected. If a value is not selected, then the page should reload with configured error message. My input form action_item_search.jsp is given below: <%@ taglib prefix="s" uri="/struts-tags" %> <%@ page language="java" contentType="text/html; charset=ISO-8859-1" pageEncoding="ISO-8859-1"%> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1" /> <title>Action Item Search</title> </head> <body> <s:actionerror/> <s:fielderror /> <s:form action="action_item_search" validate="true"> <s:select label="Search Criterion" name="searchCriterion" list="#{'': 'Select One', 'creatorName':'creator name', assignedTo':'assigned to'}" required="true" /> <s:submit name="search" value="Search"></s:submit> </s:form> </body> I have add validators.xml in my WEB-INF/classes directory of exploded war file as given below: <!DOCTYPE validators PUBLIC "-//OpenSymphony Group//XWork Validator Config 1.0//EN" "http://www.opensymphony.com/xwork/xwork-validator-config-1.0.dtd"> ActionItemTrackingAction-findByCriteria-validation.xml is given below: <!DOCTYPE validators PUBLIC "-//OpenSymphony Group//XWork Validator 1.0.2//EN" "http://www.opensymphony.com/xwork/xwork-validator-1.0.2.dtd"> You must enter a search criterion. My struts mapping xml: <struts> <constant name="struts.enable.DynamicMethodInvocation" value="false" /> <constant name="struts.devMode" value="true" /> <!-- <include file="example.xml"/> --> <package name="action-item" extends="struts-default"> <action name = "action_item_search_input"> <result name = "success">/action-item-search.jsp</result> </action> <action name="action_item_search" class="gov.nasa.spacebook.ActionItemTrackingAction" method="fetchByCriteria"> <result name = "success">/action-item-result.jsp</result> <result name = "input">/action-item-search.jsp</result> <result name = "error">/action-item-search.jsp</result> </action> </package> My action class public class ActionItemTrackingAction extends ActionSupport { private List<ActionItem> actionItems; public List<ActionItemTracking> getActionItems() { return actionItems; } public void setActionItems(List<ActionItemTracking> actionItems) { this.actionItems = actionItems; } private String searchCriterion; public String getSearchCriterion() { return searchCriterion; } public void setSearchCriterion(final String criterion) { this.searchCriterion = criterion; } public String fetchByCriteria() throws Exception { final ActionItemTrackingService service = new ActionItemTrackingService(); this.actionItems = service.getByField(this.actionItem); return super.execute(); } }

    Read the article

  • Boot From a USB Drive Even if your BIOS Won’t Let You

    - by Trevor Bekolay
    You’ve always got a trusty bootable USB flash drive with you to solve computer problems, but what if a PC’s BIOS won’t let you boot from USB? We’ll show you how to make a CD or floppy disk that will let you boot from your USB drive. This boot menu, like many created before USB drives became cheap and commonplace, does not include an option to boot from a USB drive. A piece of freeware called PLoP Boot Manager solves this problem, offering an image that can burned to a CD or put on a floppy disk, and enables you to boot to a variety of devices, including USB drives. Put PLoP on a CD PLoP comes as a zip file, which includes a variety of files. To put PLoP on a CD, you will need either plpbt.iso or plpbtnoemul.iso from that zip file. Either disc image should work on most computers, though if in doubt plpbtnoemul.iso should work “everywhere,” according to the readme included with PLoP Boot Manager. Burn plpbtnoemul.iso or plpbt.iso to a CD and then skip to the “booting PLoP Boot Manager” section. Put PLoP on a Floppy Disk If your computer is old enough to still have a floppy drive, then you will need to put the contents of the plpbt.img image file found in PLoP’s zip file on a floppy disk. To do this, we’ll use a freeware utility called RawWrite for Windows. We aren’t fortunate enough to have a floppy drive installed, but if you do it should be listed in the Floppy drive drop-down box. Select your floppy drive, then click on the “…” button and browse to plpbt.img. Press the Write button to write PLoP boot manager to your floppy disk. Booting PLoP Boot Manager To boot PLoP, you will need to have your CD or floppy drive boot with higher precedence than your hard drive. In many cases, especially with floppy disks, this is done by default. If the CD or floppy drive is not set to boot first, then you will need to access your BIOS’s boot menu, or the setup menu. The exact steps to do this vary depending on your BIOS – to get a detailed description of the process, search for your motherboard’s manual (or your laptop’s manual if you’re working with a laptop). In general, however, as the computer boots up, some important keyboard strokes are noted somewhere prominent on the screen. In our case, they are at the bottom of the screen. Press Escape to bring up the Boot Menu. Previously, we burned a CD with PLoP Boot Manager on it, so we will select the CD-ROM Drive option and hit Enter. If your BIOS does not have a Boot Menu, then you will need to access the Setup menu and change the boot order to give the floppy disk or CD-ROM Drive higher precedence than the hard drive. Usually this setting is found in the “Boot” or “Advanced” section of the Setup menu. If done correctly, PLoP Boot Manager will load up, giving a number of boot options. Highlight USB and press Enter. PLoP begins loading from the USB drive. Despite our BIOS not having the option, we’re now booting using the USB drive, which in our case holds an Ubuntu Live CD! This is a pretty geeky way to get your PC to boot from a USB…provided your computer still has a floppy drive. Of course if your BIOS won’t boot from a USB it probably has one…or you really need to update it. 
Download PLoP Boot Manager | Download RawWrite for Windows

    Read the article

  • Converting a PV vm back into an HVM vm

    - by wim.coekaerts
    I have been doing some Oracle VM benchmark stuff in the last week or 2 in my off hours and yesterday I wanted to convert one of my VMs that was based on a paravirt kernel into a vm that just boots as a regular hardware virt VM with a standard x86-64 kernel. It took me a little while to figure out the fastest way so now that I have it pretty much down I wanted to share the steps. A PV kernel uses pygrub and a paravirt kernel image that lives on the vm image virtual disk. since this disk image does not have to be bootable it doesn't contain a boot sector and if you just restart the VM in hvm mode the virtual bios will just not do much as it can't start the boot process from disk The first thing I do is make a backup of my vm.cfg file :-) and then edit it as follows : the original file contains : bootloader = '/usr/bin/pygrub' I replace that with : acpi = 1 apic = 1 builder = 'hvm' device_model = '/usr/lib/xen/bin/qemu-dm' kernel = '/usr/lib/xen/boot/hvmloader' then changing the disk files. I change my xvd disks to hd disks and I copy over the iso image of my instal lDVD. In the case of my VM template it was based on OL5U4 So I downloaded Enterprise-R5-U4-Server-x86_64-dvd.iso and added it as a cd device. disk = ['file:/ovs/OVM_EL5U4_X86_64_11202RAC_PVM/System.img,xvda,w', 'file:/ovs/OVM_EL5U4_X86_64_11202RAC_PVM/Oracle11202RAC_x86_64-xvdb.img,xvdb,w', ] to disk = ['file:/ovs/OVM_EL5U4_X86_64_11202RAC_PVM/System.img,hda,w', 'file:/ovs/OVM_EL5U4_X86_64_11202RAC_PVM/Oracle11202RAC_x86_64-xvdb.img,hdb,w', 'file:/ovs/OVM_EL5U4_X86_64_11202RAC_PVM/Enterprise-R5-U4-Server-x86_64-dvd.iso, hdc:cdrom,r', ] boot='d' for the network devices (vifs) I change : vif = ['bridge=xenbr2,type=netfront'] to vif = ['bridge=xenbr2,type=ioemu'] That should do it. Next, inside the VM, I copy over the regular kernel rpm that I want to end up running in hvm mode. In this example case it was : kernel-2.6.18-164.0.0.0.1.el5.x8664.rpm. I will use that later on in the process. I put this kernel simply in /root At this point I just start the vm with xm create vm.cfg and start my vnc console to the vm console. Oracle Linux will boot from the iso image, I just go through the install steps and click on UPgrade existing (not re-install). Because the VM is the same as the ISO the install won't actually do anything and it will run through instantly. When the "Reboot" button pops up, don't reboot. Switch to the command prompt console. hi alt-f2 to go to the shell prompt. Now it's easy : umount /mnt/sysimage/boot cd /mnt/sysimage chroot . mount /dev/hda1 (if that was your /boot partition) export PATH=/sbin:$PATH (just to clean that up) edit /etc/modprobe.conf and comment out the xen modules (just put a # in front) Install grub. if your /boot is hda1 then that is (hd0,0) $ grub root (hd0,0) setup (hd0) exit grub now you have a good bootsector, grub installed and you have your grub.conf file Install the new kernel cd root (this is your old /root in your pv image) rpm -ivh remove (or comment out) boot='d' in your vm.cfg restart the VM and you should be good to go, regular grub should start and load your environment. Caveats : this assumes you used labels for your filesystems. if /etc/fstab were to have devices listed then you would have to rename these device before rebooting as well. If you had a /dev/xvda disk then this would be /dev/hda or /dev/sda. All in all it is a relatively short and simple process.

    Read the article

  • Booting the liveCD/USB in EFI mode fails on Samsung Tablet XE700T1A

    - by F.L.
    My tablet is Samsung Series 7 Slate (XE700T1A-A02FR (French Language)). It operates an Intel Sandy Bridge architecture. The main issue about this tablet is that it ships with an installed Windows 7 in (U)EFI mode (GPT partition table, etc.), so I'd like to get an EFI dual boot with Ubuntu. But it seems I can't boot on the liveCD in EFI mode. It starts loading (up to initrd), but I then get a blank (black) screen. I've tried the nomodeset kernel option (as well as removing quiet and splash) with no luck. [2012-09-27] I have used the Ubuntu 12.04.1 Desktop ISO (I have read somewhere that it is the only one that can boot in EFI mode). I'd say this has something to do with UEFI since the LiveCD boots in bios mode but not in efi mode. Besides, I am not sure my boot info will help, since I can't boot the LiveCD in EFI mode. As a result I can't install ubuntu in EFI mode. So it would be the boot info from the liveCD boot in bios mode. This happens on a ubuntu-12.04.1-desktop-amd64 iso used on a LiveUSB. Live USB was created by dd'ing the iso onto the full disk device (i.e. /dev/sdx no number) of the Flash drive. I have also tried copying the LiveCD files on a primary GPT partition, but with no luck, I just get the grub shell, no menu, no install option. [2012-09-28] I tried today a flash drive created with Ubuntu's Startup Disk Creator and the alternate 12.04.1 64 bit ISO. I get a grub menu in text mode (which meens it did start in efi mode) with install options / test options. But when I start any of these, I simply get a black screen (no cursor, neither mouse nor text-mode cursor). I tried removing the 'quiet' option and adding nomodeset and acpi=off, but it didn't do any good. So this is the same result as for the LiveCD. [2012-10-01] I have tried with a version of the secure remix version via usb-creator-gtk. The boot on the USB key has the same symptoms. Boot in EFI mode is impossible (I have menu but whatever entry I choose, I get the blank screen problem). The boot in BIOS mode works, I did the install. Then I used boot-repair to try installing grub-efi and get a system that would boot in efi mode. But I can't boot this system, because the EFI firmware doesn't seem to detect that sda contains a valid efi partition. Here is the resulting boot-info Boot info 1253554 [2012-10-01] Today, I have reinstalled the pre-shipped version of windows 7, and then installed ubuntu from a secure-remix iso dumped on USB flash drive vie usb-creator-gtk booted in BIOS mode. When install ended, I said "continue testing" then I used boot-repair to try get the bootloader installed. Now, when I boot the tablet, I get the grub menu, it can chainload windows 7 flawlessly. But when I try to start one of the ubuntu options I get the same old blank screen. Here is the new boot-info: Boot info 1253927 [2012-10-01] I tried installing the 3.3 kernel by chrooting a live usb boot (secure remix again) into the installed system. Same symptoms. I feel the key to this is that the device's efi firmware (which is EFI v2.0) would expose the graphics hardware in a way that prevents the kernel to initialize it, and thus prevents it from booting (the kernel stops all drive access just after the screen turns kind of very dark purple). Here is some info on the UEFI firmware as given by rEFInd: EFI revision: 2.00 Platform: x86_64 (64 bit) Firmware: American Megatrends 4.635 Screen Output: Graphics Output (UEFI), 800x600 [2012-10-08] This week end I tried loading the kernel with elilo. 
Even though I didn't have any more luck booting the kernel, elilo gives more information when loading it. I think the next step is trying to load a kernel with an EFI stub directly.

    Read the article

  • Uncompress GZIPed HTTP Response in Java

    - by bill0ute
    Hi, I'm trying to uncompress a GZIPed HTTP Response by using GZIPInputStream. However I always have the same exception when I try to read the stream : java.util.zip.ZipException: invalid bit length repeat My HTTP Request Header: GET www.myurl.com HTTP/1.0\r\n User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6\r\n Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n Accept-Language: fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3\r\n Accept-Encoding: gzip,deflate\r\n Accept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7\r\n Keep-Alive: 115\r\n Connection: keep-alive\r\n X-Requested-With: XMLHttpRequest\r\n Cookie: Some Cookies\r\n\r\n At the end of the HTTP Response header, I get path=/Content-Encoding: gzip, followed by the gziped response. I tried 2 similars codes to uncompress : GZIPInputStream gzip = new GZIPInputStream (new ByteArrayInputStream (tBytes)); StringBuffer szBuffer = new StringBuffer (); byte tByte [] = new byte [1024]; while (true) { int iLength = gzip.read (tByte, 0, 1024); // <-- Error comes here if (iLength < 0) break; szBuffer.append (new String (tByte, 0, iLength)); } And this one that I get on this forum : InputStream gzipStream = new GZIPInputStream (new ByteArrayInputStream (tBytes)); Reader decoder = new InputStreamReader (gzipStream, "UTF-8");//<- I tried ISO-8859-1 and get the same exception BufferedReader buffered = new BufferedReader (decoder); I guess this is an encoding error. Best regards, bill0ute
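    The "invalid bit length repeat" error usually means the bytes handed to GZIPInputStream are not exactly the gzip entity body, for example when the status line, headers or chunked-transfer markers are still mixed into the buffer, or the bytes were modified on the way. As a minimal sketch (not the asker's code, and with the URL and the UTF-8 charset as assumptions), letting HttpURLConnection handle the HTTP framing so that only the body reaches GZIPInputStream looks like this:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.InputStreamReader;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.util.zip.GZIPInputStream;

        public class GzipFetch {
            public static void main(String[] args) throws IOException {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL("http://www.example.com/").openConnection();
                conn.setRequestProperty("Accept-Encoding", "gzip");

                InputStream body = conn.getInputStream();
                if ("gzip".equalsIgnoreCase(conn.getContentEncoding())) {
                    body = new GZIPInputStream(body); // only the entity body is wrapped
                }

                // The charset should really come from the Content-Type header; UTF-8 is assumed here.
                BufferedReader reader = new BufferedReader(new InputStreamReader(body, "UTF-8"));
                StringBuilder sb = new StringBuilder();
                for (String line; (line = reader.readLine()) != null; ) {
                    sb.append(line).append('\n');
                }
                reader.close();
                System.out.println(sb);
            }
        }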

    Read the article

  • CSG operations on implicit surfaces with marching cubes [SOLVED]

    - by Mads Elvheim
    I render isosurfaces with marching cubes, (or perhaps marching squares as this is 2D) and I want to do set operations like set difference, intersection and union. I thought this was easy to implement, by simply choosing between two vertex scalars from two different implicit surfaces, but it is not. For my initial testing, I tried with two spheres circles, and the set operation difference. i.e A - B. One circle is moving and the other one is stationary. Here's the approach I tried when picking vertex scalars and when classifying corner vertices as inside or outside. The code is written in C++. OpenGL is used for rendering, but that's not important. Normal rendering without any CSG operations does give the expected result. void march(const vec2& cmin, //min x and y for the grid cell const vec2& cmax, //max x and y for the grid cell std::vector<vec2>& tri, float iso, float (*cmp1)(const vec2&), //distance from stationary circle float (*cmp2)(const vec2&) //distance from moving circle ) { unsigned int squareindex = 0; float scalar[4]; vec2 verts[8]; /* initial setup of the grid cell */ verts[0] = vec2(cmax.x, cmax.y); verts[2] = vec2(cmin.x, cmax.y); verts[4] = vec2(cmin.x, cmin.y); verts[6] = vec2(cmax.x, cmin.y); float s1,s2; /********************************** ********For-loop of interest****** *******Set difference between **** *******two implicit surfaces****** **********************************/ for(int i=0,j=0; i<4; ++i, j+=2){ s1 = cmp1(verts[j]); s2 = cmp2(verts[j]); if((s1 < iso)){ //if inside circle1 if((s2 < iso)){ //if inside circle2 scalar[i] = s2; //then set the scalar to the moving circle } else { scalar[i] = s1; //only inside circle1 squareindex |= (1<<i); //mark as inside } } else { scalar[i] = s1; //inside neither circle } } if(squareindex == 0) return; /* Usual interpolation between edge points to compute the new intersection points */ verts[1] = mix(iso, verts[0], verts[2], scalar[0], scalar[1]); verts[3] = mix(iso, verts[2], verts[4], scalar[1], scalar[2]); verts[5] = mix(iso, verts[4], verts[6], scalar[2], scalar[3]); verts[7] = mix(iso, verts[6], verts[0], scalar[3], scalar[0]); for(int i=0; i<10; ++i){ //10 = maxmimum 3 triangles, + one end token int index = triTable[squareindex][i]; //look up our indices for triangulation if(index == -1) break; tri.push_back(verts[index]); } } This gives me weird jaggies: It looks like the CSG operation is done without interpolation. It just "discards" the whole triangle. Do I need to interpolate in some other way, or combine the vertex scalar values? I'd love some help with this. A full testcase can be downloaded HERE EDIT: Basically, my implementation of marching squares works fine. It is my scalar field which is broken, and I wonder what the correct way would look like. Preferably I'm looking for a general approach to implement the three set operations I discussed above, for the usual primitives (circle, rectangle/square, plane)
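    A sketch of an alternative formulation, not necessarily the fix the asker settled on: rather than branching per corner between the two fields, combine the two scalar fields into a single field and run the ordinary marching-squares step on that. With the convention used above, where a corner is inside when its value is below iso, the usual combinations are

        \[
        d_{A \cup B}(p) = \min\bigl(d_A(p),\, d_B(p)\bigr), \qquad
        d_{A \cap B}(p) = \max\bigl(d_A(p),\, d_B(p)\bigr), \qquad
        d_{A \setminus B}(p) = \max\bigl(d_A(p),\, 2\,\mathrm{iso} - d_B(p)\bigr)
        \]

    The last formula reduces to max(d_A, -d_B) when iso = 0. The combined field is only an approximation of a true distance away from the surfaces, but it classifies corners consistently and gives the edge interpolation a single scalar per corner, which avoids the per-corner switching that produces the discarded-triangle look.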

    Read the article

  • empty response body in ajax (or 206 Partial Content)

    - by Nikita Rybak
    Hi guys, I'm feeling completely stupid because I've spent two hours solving task which should be very simple and which I solved many times before. But now I'm not even sure in which direction to dig. I fail to fetch static content using ajax from local servers (Apache and Mongrel). I get responses 200 and 206 (depending on the server), empty response text (although Content-Length header is always correct), firebug shows request in red. Javascript is very generic, I'm getting same results even here: http://www.w3schools.com/ajax/tryit.asp?filename=tryajax_first (just change document location to 'http://localhost:3000/whatever') So, it's probably not the cause. Well, now I'm out of ideas. I can also post http headers, if it'll help. Thanks! Response Headers Connection close Date Sat, 01 May 2010 21:05:23 GMT Last-Modified Sun, 18 Apr 2010 19:33:26 GMT Content-Type text/html Content-Length 7466 Request Headers Host localhost:3000 User-Agent Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 115 Connection keep-alive Referer http://www.w3schools.com/ajax/tryit_view.asp Origin http://www.w3schools.com Response Headers Date Sat, 01 May 2010 21:54:59 GMT Server Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_jk/1.2.28 Etag "3d5cbdb-fb4-4819c460d4a40" Accept-Ranges bytes Content-Length 4020 Cache-Control max-age=7200, public, proxy-revalidate Expires Sat, 01 May 2010 23:54:59 GMT Content-Range bytes 0-4019/4020 Keep-Alive timeout=5, max=100 Connection Keep-Alive Content-Type application/javascript Request Headers Host localhost User-Agent Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 115 Connection keep-alive Origin null

    Read the article

  • jQuery AJAX Character Encoding Problem

    - by Salty
    Hi everyone, I'm currently coding a French website. There's a schedule page, where a link on the side can be used to load another day's schedule. http://aquate.us/film/horaire.html (At the moment, only the links for November 13th and November 14th work) Here's the JS I'm using to do this: <script type="text/javascript"> function load(y) { $.get(y,function(d) { $("#replace").html(d); mod(); }); } function mod() { $("#dates a").click(function() { y = $(this).attr("href"); load(y); return false; }); } mod(); </script> The actual AJAX works like a charm. My problem lies with the response to the request. Because it is a French website, there are many accented letters. I'm using the ISO-8859-15 charset for that very reason. However, in the response to my AJAX request, the accents are becoming ?'s because the character encoding seems to be changed back to UTF-8. How do I avoid this? I've already tried adding some PHP at the top of the requested documents to set the character set: <?php header('Content-Type: text/html; charset=ISO-8859-15'); ?> But that doesn't seem to work either. Any thoughts? Also, while any of you are looking here...why does the rightmost column seem to become smaller when a new page is loaded, causing the table to distort and each <li> within the <td> to wrap to the next line? Cheers

    Read the article

  • Forms authentication, ASP.NET MVC and WCF RESTful service

    - by J F
    One test webserver, with the following applications service.ganymedes.com:8008 - WCF RESTful service, basically the FormsAuth sample from WCF Starter Kit Preview 2 mvc.ganymedes.com:8008 - ASP.NET MVC 2.0 application web.config for service.ganymedes.com: <authentication mode="Forms"> <forms loginUrl="~/login.aspx" timeout="2880" domain="ganymedes.com" name="GANYMEDES_COOKIE" path="/" /> </authentication> web.config for mvc.ganymedes.com: <authentication mode="Forms"> <forms loginUrl="~/Account/LogOn" timeout="2880" domain="ganymedes.com" name="GANYMEDES_COOKIE" path="/" /> </authentication> Trying my darndest, a GET (or POST for that matter) via jQuery's $.ajax or getJson does not send my cookie (according to Firebug), so I get HTTP 302 returned from the WCF service: Request Headers Host service.ganymedes.com:8008 User-Agent Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729) Accept application/json, text/javascript, */* Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 300 Connection keep-alive Referer http://mvc.ganymedes.com:8008/Test Origin http://mvc.ganymedes.com:8008 It's sent when mucking about on the MVC site though: Request Headers Host mvc.ganymedes.com:8008 User-Agent Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729) Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 300 Connection keep-alive Referer http://mvc.ganymedes.com:8008/Test Cookie GANYMEDES_COOKIE=0106A4A666C8C615FBFA9811E9A6C5219C277D625C04E54122D881A601CD0E00C10AF481CB21FAED544FAF4E9B50C59CDE2385644BBF01DDD4F211FE7EE8FAC2; GANYMEDES_COOKIE=D6569887B7C5B67EFE09079DD59A07A98311D7879817C382D79947AE62B5508008C2B2D2112DCFCE5B8D4C61D45A109E61BBA637FD30315C2D8353E8DDFD4309 I also put the exact same settings in both applications' web.config files (self-generated validationKey and decryptionKey).

    Read the article

  • Charset and POST request

    - by jriff
    Hi All! I have a Rails 2.3.5 application that is working fine with UTF-8 and international characters. Now I have made some integration to a payment gateway where I POST some data, wait a while and get a POST back. The problem is that when I get that post back the international characters are broken. Instead of "sørensen" I get: "sørensen". If I do an iconv -fISO-8859-1 -tUTF8 it gets correctly converted to the former (I do that from a OS X command prompt). I have examined the POST request with logger.info(request.headers.inspect) in my controller and I can see that no charset parameter is given. As far as I can see the POST from the gateway must be UTF8 since one character (ø) gets translated to two (ø). So why does Rails think that the POST is ISO-8859-1? I know that one solution is to simply convert the params-hash with Iconv in the controller but I would like to know what is happening. Thanks in advance. Regards, Jacob

    Read the article

  • How to send an HTTP GET request with headers using cURL

    - by mithunmo
    Hello , I need to send GET Request method with the below headers . I am getting the following capture from HTTP live headers ***http://172.20.22.26/ GET / HTTP/1.1 Host: 172.20.22.26 User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.1) Gecko/2008070208 Firefox/3.0.1 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: en-us,en;q=0.5 Accept-Encoding: gzip,deflate Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive: 300 Connection: keep-alive Authorization: Basic bWl0aHVuOm1pdGh1bg== HTTP/1.x 200 OK Date: Thu, 01 Jan 2009 00:29:20 GMT Server: HTTPsrv Connection: Keep-Alive Keep-Alive: timeout=30, max=100 Transfer-Encoding: chunked Content-Type: text/html ----------------------------**------------------------------* I am using the following program . It is not working . Please let me know where I am going wrong. <?php $credentials = "mithun:mithun"; $url = "http://172.20.22.26"; $headers = array( "GET /HTTP/1.1", "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.1) Gecko/2008070208 Firefox/3.0.1", "Content-type: text/xml;charset=\"utf-8\"", "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8", "Accept-Language: en-us,en;q=0.5", "Accept-Encoding: gzip,deflate", "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7", "Keep-Alive: 300", "Connection: keep-alive", "Authorization: Basic " . base64_encode($credentials)); $ch = curl_init(); curl_setopt($ch, CURLOPT_URL,$url); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_TIMEOUT, 60); curl_setopt($ch, CURLOPT_HTTPHEADER, $headers); //curl_setopt($ch, CURLOPT_USERAGENT, $defined_vars['HTTP_USER_AGENT']); $data = curl_exec($ch); if (curl_errno($ch)) { print "Error: " . curl_error($ch); } else { // Show me the result var_dump($data); curl_close($ch); }?>

    Read the article

  • How to test an application for correct encoding (e.g. UTF-8)

    - by Olaf
    Encoding issues are among the one topic that have bitten me most often during development. Every platform insists on its own encoding, most likely some non-UTF-8 defaults are in the game. (I'm usually working on Linux, defaulting to UTF-8, my colleagues mostly work on german Windows, defaulting to ISO-8859-1 or some similar windows codepage) I believe, that UTF-8 is a suitable standard for developing an i18nable application. However, in my experience encoding bugs are usually discovered late (even though I'm located in Germany and we have some special characters that along with ISO-8859-1 provide some detectable differences). I believe that those developers with a completely non-ASCII character set (or those that know a language that uses such a character set) are getting a head start in providing test data. But there must be a way to ease this for the rest of us as well. What [technique|tool|incentive] are people here using? How do you get your co-developers to care for these issues? How do you test for compliance? Are those tests conducted manually or automatically? Adding one possible answer upfront: I've recently discovered fliptitle.com (they are providing an easy way to get weird characters written "u?op ?pisdn" *) and I'm planning on using them to provide easily verifiable UTF-8 character strings (as most of the characters used there are at some weird binary encoding position) but there surely must be more systematic tests, patterns or techniques for ensuring UTF-8 compatibility/usage. Note: Even though there's an accepted answer, I'd like to know of more techniques and patterns if there are some. Please add more answers if you have more ideas. And it has not been easy choosing only one answer for acceptance. I've chosen the regexp answer for the least expected angle to tackle the problem although there would be reasons to choose other answers as well. Too bad only one answer can be accepted. Thank you for your input. *) that's "upside down" written "upside down" for those that cannot see those characters due to font problems
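    One automatable check along these lines, offered as a sketch of a single building block rather than a full i18n test strategy: decode candidate bytes strictly as UTF-8 and fail on malformed input, instead of letting the platform silently substitute replacement characters. Test data that is valid ISO-8859-1 but not valid UTF-8 (any bare accented byte) makes the mismatch visible immediately.

        import java.nio.ByteBuffer;
        import java.nio.charset.CharacterCodingException;
        import java.nio.charset.CodingErrorAction;
        import java.nio.charset.StandardCharsets;

        public class Utf8Check {
            static boolean isValidUtf8(byte[] bytes) {
                try {
                    StandardCharsets.UTF_8.newDecoder()
                            .onMalformedInput(CodingErrorAction.REPORT)
                            .onUnmappableCharacter(CodingErrorAction.REPORT)
                            .decode(ByteBuffer.wrap(bytes));
                    return true;
                } catch (CharacterCodingException e) {
                    return false;
                }
            }

            public static void main(String[] args) {
                byte[] utf8Bytes = "Grüße, äöü".getBytes(StandardCharsets.UTF_8);
                byte[] latin1Bytes = { (byte) 0xE4, (byte) 0xF6, (byte) 0xFC }; // "äöü" in ISO-8859-1
                System.out.println(isValidUtf8(utf8Bytes));   // true
                System.out.println(isValidUtf8(latin1Bytes)); // false: a bare 0xE4 is malformed UTF-8
            }
        }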

    Read the article

  • CSG operations on implicit surfaces with marching cubes

    - by Mads Elvheim
    I render isosurfaces with marching cubes, (or perhaps marching squares as this is 2D) and I want to do set operations like set difference, intersection and union. I thought this was easy to implement, by simply choosing between two vertex scalars from two different implicit surfaces, but it is not. For my initial testing, I tried with two spheres, and the set operation difference. i.e A - B. One sphere is moving and the other one is stationary. Here's the approach I tried when picking vertex scalars and when classifying corner vertices as inside or outside. The code is written in C++. OpenGL is used for rendering, but that's not important. Normal rendering without any CSG operations does give the expected result. void march(const vec2& cmin, //min x and y for the grid cell const vec2& cmax, //max x and y for the grid cell std::vector<vec2>& tri, float iso, float (*cmp1)(const vec2&), //distance from stationary sphere float (*cmp2)(const vec2&) //distance from moving sphere ) { unsigned int squareindex = 0; float scalar[4]; vec2 verts[8]; /* initial setup of the grid cell */ verts[0] = vec2(cmax.x, cmax.y); verts[2] = vec2(cmin.x, cmax.y); verts[4] = vec2(cmin.x, cmin.y); verts[6] = vec2(cmax.x, cmin.y); float s1,s2; /********************************** ********For-loop of interest****** *******Set difference between **** *******two implicit surfaces****** **********************************/ for(int i=0,j=0; i<4; ++i, j+=2){ s1 = cmp1(verts[j]); s2 = cmp2(verts[j]); if((s1 < iso)){ //if inside sphere1 if((s2 < iso)){ //if inside sphere2 scalar[i] = s2; //then set the scalar to the moving sphere } else { scalar[i] = s1; //only inside sphere1 squareindex |= (1<<i); //mark as inside } } else { scalar[i] = s1; //inside neither sphere } } if(squareindex == 0) return; /* Usual interpolation between edge points to compute the new intersection points */ verts[1] = mix(iso, verts[0], verts[2], scalar[0], scalar[1]); verts[3] = mix(iso, verts[2], verts[4], scalar[1], scalar[2]); verts[5] = mix(iso, verts[4], verts[6], scalar[2], scalar[3]); verts[7] = mix(iso, verts[6], verts[0], scalar[3], scalar[0]); for(int i=0; i<10; ++i){ //10 = maxmimum 3 triangles, + one end token int index = triTable[squareindex][i]; //look up our indices for triangulation if(index == -1) break; tri.push_back(verts[index]); } } This gives me weird jaggies: It looks like the CSG operation is done without interpolation. It just "discards" the whole triangle. Do I need to interpolate in some other way, or combine the vertex scalar values? I'd love some help with this. A full testcase can be downloaded HERE

    Read the article

  • RSS feed generated by SharePoint includes a stylesheet tag; how can it be removed?

    - by iHeartDucks
    The feed which SharePoint Generates is here (I copied it to pastie because I thought it would be clear there) However, the xml file comes with a style sheet tag. How do I remove that? Does SharePoint always generate that? Due to the presence of that tag, I am unable to apply another style sheet of my own using the XML WebPart. EDIT: I don't think the issue is related to the style sheet. If I copy the xml and paste it in the "Xml Editor" of the Web Part everything works just fine. If I provide the URL, that is when I do not see any data. This is my XSL file <?xml version="1.0" encoding="ISO-8859-1"?> <xsl:stylesheet version="1.0" exclude-result-prefixes="x d ddwrt xsl msxsl" xmlns:x="http://www.w3.org/2001/XMLSchema" xmlns:d="http://schemas.microsoft.com/sharepoint/dsp" xmlns:ddwrt="http://schemas.microsoft.com/WebParts/v2/DataView/runtime" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt"> <xsl:output method="html" version="1.0" encoding="iso-8859-1" indent="yes"/> <xsl:template match="/"> <xsl:value-of select="count(rss)" /> <xsl:value-of select="count(rss/channel)" /> <xsl:value-of select="count(rss/channel/item)" /> <xsl:for-each select="rss/channel/item"> <xsl:value-of select="title" /> </xsl:for-each> </xsl:template> </xsl:stylesheet> Pastie link

    Read the article

  • UnsupportedEncodingException thrown when using Resin and Grails

    - by knorv
    I've encountered a strange problem in a Grails webapp running under Resin: java.io.UnsupportedEncodingException is thrown quite frequently due to various unknown encoding strings (such as "ISO8859_10", "ISO-8859-10"), and the strange thing is that this is done entirely within the Resin and Grails code. That is - no custom code is involved when the exception is thrown. I'm not sure if it is Grails or the servlet container's code that should handle the exception. But I'd assume that the exception should be handled somewhere and not bubble up all the way to stderr. This is the exception in full: java.io.UnsupportedEncodingException: ISO-8859-10 at com.caucho.vfs.i18n.JDKWriter$OutputStreamEncodingWriter.<init>(JDKWriter.java:112) at com.caucho.vfs.i18n.JDKWriter.create(JDKWriter.java:79) at com.caucho.vfs.Encoding.getWriteEncoding(Encoding.java:231) at com.caucho.server.connection.ToByteResponseStream.setEncoding(ToByteResponseStream.java:137) at com.caucho.server.connection.AbstractHttpResponse.setLocale(AbstractHttpResponse.java:1683) at com.caucho.server.connection.HttpServletResponseImpl.setLocale(HttpServletResponseImpl.java:115) at javax.servlet.ServletResponseWrapper.setLocale(ServletResponseWrapper.java:139) at javax.servlet.ServletResponseWrapper.setLocale(ServletResponseWrapper.java:139) at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1035) at org.codehaus.groovy.grails.web.servlet.GrailsDispatcherServlet.doDispatch(GrailsDispatcherServlet.java:290) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:716) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:647) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:552) at javax.servlet.http.HttpServlet.service(HttpServlet.java:114) My questions: Should the exception be handled? If so, is it the responsibility of the servlet container (Resin) or the web framework (Grails)? How would you go about solving this? (I'd rather not have the exception log cluttered with exceptions that I can do nothing about.)
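    A small diagnostic sketch, an assumption about where to look rather than a fix: check whether the JVM running Resin even knows the encoding names that appear in the exception. Only a handful of charsets (US-ASCII, ISO-8859-1 and the UTF variants) are guaranteed by the platform, so ISO-8859-10 may simply be absent from the JRE in use.

        import java.nio.charset.Charset;

        public class CharsetProbe {
            public static void main(String[] args) {
                String[] names = { "ISO-8859-10", "ISO8859_10", "ISO-8859-1", "UTF-8" };
                for (String name : names) {
                    try {
                        System.out.println(name + " supported: " + Charset.isSupported(name));
                    } catch (IllegalArgumentException e) {
                        System.out.println(name + " is not a legal charset name: " + e.getMessage());
                    }
                }
            }
        }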

    Read the article

  • latin1/unicode conversion problem with ajax request and special characters

    - by mfn
    Server is PHP5 and HTML charset is latin1 (iso-8859-1). With regular form POST requests, there's no problem with "special" characters like the em dash (–) for example. Although I don't know for sure, it works. Probably because there exists a representable character for the browser at char code 150 (which is what I see in PHP on the server for a literal em dash with ord). Now our application also provides some kind of preview mechanism via ajax: the text is sent to the server and a complete HTML for a preview is sent back. However, the ordinary char code 150 em dash character when sent via ajax (tested with GET and POST) mutates into something more: %E2%80%93. I see this already in the apache log. According to various sources I found, e.g. http://www.tachyonsoft.com/uc0020.htm , this is the UTF8 byte representation of em dash and my current knowledge is that JavaScript handles everything in Unicode. However within my app, I need everything in latin1. Simply said: just like a regular POST request would have given me that em dash as char code 150, I would need that for the translated UTF8 representation too. That's were I'm failing, because with PHP on the server when I try to decode it with either utf8_decode(...) or iconv('UTF-8', 'iso-8859-1', ...) but in both cases I get a regular ? representing this character (and iconv also throws me a notice: Detected an illegal character in input string ). My goal is to find an automated solution, but maybe I'm trying to be überclever in this case? I've found other people simply doing manual replacing with a predefined input/output set; but that would always give me the feeling I could loose characters. The observant reader will note that I'm behind on understanding the full impact/complexity with things about Unicode and conversion of chars and I definitely prefer to understand the thing as a whole then a simply manual mapping. thanks

    Read the article

  • Encoding issue: Java -> xls

    - by Xerg
    This is not a pure java question and can also be related to HTML I've written a java servlet that queries a database table and shows the result as a html table. The user can also ask to receive the result as an Excel sheet. Im creating the Excel sheet by printing the same html table, but with the content-type of "application/vnd.ms-excel". The Excel file is created fine. The problem is that the tables may contain non-english data so I want to use a UTF-8 encoding. PrintWriter out = response.getWriter(); response.setContentType("application/vnd.ms-excel:ISO-8859-1"); //response.setContentType("application/vnd.ms-excel:UTF-8"); response.setHeader("cache-control", "no-cache"); response.setHeader("Content-Disposition", "attachment; filename=file.xls"); out.print(src); out.flush(); The non-english characters appear as garbage (áéíóú) Also I tried converting to bytes from String byte[] arrByte = src.getBytes("ISO-8859-1"); String result = new String(arrByte, "UTF-8"); But I Still getting garbage, What can I do?. Thanks
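    A hedged sketch of one thing to try, not a verified fix: choose a single charset, set it on the response before getWriter() is called, and make sure the generated HTML declares the same charset. The ordering matters because the writer's encoding is fixed at the moment it is obtained; the helper name below is made up for the example.

        import java.io.IOException;
        import java.io.PrintWriter;
        import javax.servlet.http.HttpServletResponse;

        public class ExcelExportHelper { // hypothetical helper, not part of the original servlet
            static void writeExcelResponse(HttpServletResponse response, String src) throws IOException {
                response.setContentType("application/vnd.ms-excel");
                response.setCharacterEncoding("UTF-8");   // must happen before getWriter()
                response.setHeader("Cache-Control", "no-cache");
                response.setHeader("Content-Disposition", "attachment; filename=file.xls");

                PrintWriter out = response.getWriter();   // this writer now encodes as UTF-8
                // If src is an HTML table, a matching <meta charset> inside it helps Excel pick the same encoding.
                out.print(src);
                out.flush();
            }
        }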

    Read the article

  • My multipart email script sends HTML messages just fine, but the plain text alternative does not render

    - by hsatterwhite
    I have a script set up to send out multipart emails; plain text and html messages. The HTML messages work just fine, but when I used an email client that only does plain text the plaint text message does not render and I get the following: -- This message was generated automatically by Me http://www.somewebsite.com/ $html_msg = $message_details; $plain_text_msg = strip_tags($message_details); $headers = <<<HEADERS From: Me <[email protected]> MIME-Version: 1.0 Content-Type: multipart/alternative; boundary="==PHP-alt$mime_boundary" HEADERS; // Use our boundary string to create plain text and HTML versions $message = <<<MESSAGE --==PHP-alt$mime_boundary Content-Type: text/plain; charset="iso-8859-1" Content-Transfer-Encoding: 7bit $plain_text_msg -- This message was generated automatically by Me http://www.somewebsite.com/ If you did not request this message, please notify [email protected] --==PHP-alt$mime_boundary Content-Type: text/html; charset="iso-8859-1" Content-Transfer-Encoding: 7bit <html> <body> $html_msg <p> --<br /> This message was generated automatically as a demonstration on <a href="http://www.somewebsite.com/">Me</a> </p> <p> If you did not request this message, please notify <a href="mailto:[email protected]">[email protected]</a> </p> </body> </html> --==PHP-alt$mime_boundary-- MESSAGE;

    Read the article

  • IIS6 compressing static files, but not JSON requests

    - by user500038
    I have IIS6 compression setup, and static content is being compressed correctly as per Coding Horror. Example of a static file: Response Headers Content-Length 55513 Content-Type application/x-javascript Content-Encoding gzip Last-Modified Mon, 20 Dec 2010 15:31:58 GMT Accept-Ranges bytes Vary Accept-Encoding Server Microsoft-IIS/6.0 X-Powered-By ASP.NET Date Wed, 29 Dec 2010 16:37:23 GMT Request Headers Accept */* Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 115 Connection keep-alive As you can see, the response is correctly compressed. However with a JSON call you'll see the request with the gzip parameter correctly set: Accept application/json, text/javascript, */*; q=0.01 Accept-Language en-us,en;q=0.5 Accept-Encoding gzip,deflate Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive 115 Connection keep-alive X-Requested-With XMLHttpRequest Then for the response: Cache-Control private Content-Length 6811 Content-Type application/json; charset=utf-8 Server Microsoft-IIS/6.0 X-Powered-By ASP.NET X-AspNet-Version 2.0.50727 X-AspNetMvc-Version 2.0 No gzip! I found this article by Rick Stahl, which outlines how to compress in your code, but I'd like IIS to handle this. Is this possible with IIS6?

    Read the article

  • How to make TXMLDocument (with the MSXML Implementation) always include the encoding attribute?

    - by Fabricio Araujo
    I have legacy code (I didn't write it) that always included the encoding attribute, but after recompiling it with D2010, TXMLDocument doesn't include the encoding anymore. Because the XML data has accented characters both in tags and in data, TXMLDocument.LoadFromFile simply throws EDOMParseError saying that an invalid character was found in the file. Relevant code: Doc := TXMLDocument.Create(nil); try Doc.Active := True; Doc.Encoding := XMLEncoding; RootNode := Doc.CreateElement('Test', ''); Doc.DocumentElement := RootNode; <snip> //Result := Doc.XMl.Text; Doc.SaveToXML(Result); // Both lines give the same result On older versions of Delphi, the following line is generated: <?xml version="1.0" encoding="ISO-8859-1"?> On D2010, this is generated: <?xml version="1.0"?> If I manually change that line, everything works as it always has. UPDATE: XMLEncoding is a constant and is defined as follows: XMLEncoding = 'ISO-8859-1';

    Read the article

  • Foreign/accented characters in SQL query

    - by FromCanada
    I'm using Java and Spring's JdbcTemplate class to build an SQL query in Java that queries a Postgres database. However, I'm having trouble executing queries that contain foreign/accented characters. For example the (trimmed) code: JdbcTemplate select = new JdbcTemplate( postgresDatabase ); String query = "SELECT id FROM province WHERE name = 'Ontario';"; Integer id = select.queryForObject( query, Integer.class ); will retrieve the province id, but if instead I did name = 'Québec' then the query fails to return any results (this value is in the database so the problem isn't that it's missing). I believe the source of the problem is that the database I am required to use has the default client encoding set to SQL_ASCII, which according to this prevents automatic character set conversions. (The Java environments encoding is set to 'UTF-8' while I'm told the database uses 'LATIN1' / 'ISO-8859-1') I was able to manually indicate the encoding when the resultSets contained values with foreign characters as a solution to a previous problem with a similar nature. Ex: String provinceName = new String ( resultSet.getBytes( "name" ), "ISO-8859-1" ); But now that the foreign characters are part of the query itself this approach hasn't been successful. (I suppose since the query has to be saved in a String before being executed anyway, breaking it down into bytes and then changing the encoding only muddles the characters further.) Is there a way around this without having to change the properties of the database or reconstruct it? PostScript: I found this function on StackOverflow when making up a title, it didn't seem to work (I might not have used it correctly, but even if it did work it doesn't seem like it could be the best solution.):
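    One related hardening step, shown as a sketch and not as a fix for the SQL_ASCII / client_encoding mismatch itself: bind the province name as a query parameter instead of splicing it into the SQL string, so the encoding of the Java source file no longer leaks into the statement text. The DAO and method names are made up for the example.

        import javax.sql.DataSource;
        import org.springframework.jdbc.core.JdbcTemplate;

        public class ProvinceDao { // hypothetical wrapper around the existing postgresDatabase DataSource
            private final JdbcTemplate select;

            public ProvinceDao(DataSource postgresDatabase) {
                this.select = new JdbcTemplate(postgresDatabase);
            }

            public Integer findIdByName(String name) {
                // The driver transfers "Québec" as a bound parameter value rather than as an inline literal.
                return select.queryForObject(
                        "SELECT id FROM province WHERE name = ?",
                        new Object[] { name },
                        Integer.class);
            }
        }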

    Read the article
