Search Results

Search found 4688 results on 188 pages for 'io redirection'.


  • selectOneMenu - java.lang.NullPointerException when adding record to the database (JSF2 and JPA2-OpenJPA)

    - by rogie
    Good day to all; I'm developing a program using JSF2 and JPA2 (OpenJPA). I'm also using IBM Rapid App Dev't v8 with WebSphere App Server v8 test server. I have two simple entities, Employee and Department. Each Department has many Employees and each Employee belongs to a Department (using deptno and workdept). My problem occurs when I try to add a new employee and select a department from a combo box (using selectOneMenu - populated from the Department table): when I run the program, the following error message appears: An Error Occurred: java.lang.NullPointerException Caused by: java.lang.NullPointerException - java.lang.NullPointerException I also tried to make another program using Deptno and Workdept as String instead of integer, but it still doesn't work. Please help; I'm also a newbie. Thanks and God bless. Below are my code, configuration and setup. Just tell me if there is some code that I forgot to include. I'm also using Derby v10.5 as my database: CREATE SCHEMA RTS; CREATE TABLE RTS.DEPARTMENT (DEPTNO INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1), DEPTNAME VARCHAR(30)); ALTER TABLE RTS.DEPARTMENT ADD CONSTRAINT PK_DEPARTMNET PRIMARY KEY (DEPTNO); CREATE TABLE RTS.EMPLOYEE (EMPNO INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1), NAME VARCHAR(50), WORKDEPT INTEGER); ALTER TABLE RTS.EMPLOYEE ADD CONSTRAINT PK_EMPLOYEE PRIMARY KEY (EMPNO); Employee and Department Entities package rts.entities; import java.io.Serializable; import javax.persistence.*; @Entity public class Employee implements Serializable { private static final long serialVersionUID = 1L; @Id @GeneratedValue(strategy=GenerationType.IDENTITY) private int empno; private String name; //bi-directional many-to-one association to Department @ManyToOne @JoinColumn(name="WORKDEPT") private Department department; ....... getter and setter methods package rts.entities; import java.io.Serializable; import javax.persistence.*; import java.util.List; @Entity public class Department implements Serializable { private static final long serialVersionUID = 1L; @Id @GeneratedValue(strategy=GenerationType.IDENTITY) private int deptno; private String deptname; //bi-directional many-to-one association to Employee @OneToMany(mappedBy="department") private List<Employee> employees; ....... getter and setter methods JSF 2 snippet using combo box (populated from the Department table) <tr> <td align="left">Department</td> <td style="width: 5px">&#160;</td> <td><h:selectOneMenu styleClass="selectOneMenu" id="department1" value="#{pc_EmployeeAdd.employee.department}"> <f:selectItems value="#{DepartmentManager.departmentSelectList}" id="selectItems1"></f:selectItems> </h:selectOneMenu></td> </tr> package rts.entities.controller; import com.ibm.jpa.web.JPAManager; import javax.persistence.EntityManager; import javax.persistence.EntityManagerFactory; import com.ibm.jpa.web.NamedQueryTarget; import com.ibm.jpa.web.Action; import javax.persistence.PersistenceUnit; import javax.annotation.Resource; import javax.transaction.UserTransaction; import rts.entities.Department; import java.util.List; import javax.persistence.Query; import java.util.ArrayList; import java.text.MessageFormat; import javax.faces.model.SelectItem; @SuppressWarnings("unchecked") @JPAManager(targetEntity = rts.entities.Department.class) public class DepartmentManager { .......
public List<SelectItem> getDepartmentSelectList() { List<Department> departmentList = getDepartment(); List<SelectItem> selectList = new ArrayList<SelectItem>(); MessageFormat mf = new MessageFormat("{0}"); for (Department department : departmentList) { selectList.add(new SelectItem(department, mf.format( new Object[] { department.getDeptname() }, new StringBuffer(), null).toString())); } return selectList; } Converter: package rts.entities.converter; import javax.faces.component.UIComponent; import javax.faces.context.FacesContext; import javax.faces.convert.Converter; import rts.entities.Department; import rts.entities.controller.DepartmentManager; import com.ibm.jpa.web.TypeCoercionUtility; public class DepartmentConverter implements Converter { public Object getAsObject(FacesContext facesContext, UIComponent arg1, String entityId) { DepartmentManager departmentManager = (DepartmentManager) facesContext .getApplication().createValueBinding("#{DepartmentManager}") .getValue(facesContext); int deptno = (Integer) TypeCoercionUtility.coerceType("int", entityId); Department result = departmentManager.findDepartmentByDeptno(deptno); return result; } public String getAsString(FacesContext arg0, UIComponent arg1, Object object) { if (object instanceof Department) { return "" + ((Department) object).getDeptno(); } else { throw new IllegalArgumentException("Invalid object type:" + object.getClass().getName()); } } } Method for Add button: public String createEmployeeAction() { EmployeeManager employeeManager = (EmployeeManager) getManagedBean("EmployeeManager"); try { employeeManager.createEmployee(employee); } catch (Exception e) { logException(e); } return ""; } faces-conf.xml <converter> <converter-for-class>rts.entities.Department</converter-for-class> <converter-class>rts.entities.converter.DepartmentConverter</converter-class> </converter> Stack trace javax.faces.FacesException: java.lang.NullPointerException at org.apache.myfaces.shared_impl.context.ExceptionHandlerImpl.wrap(ExceptionHandlerImpl.java:241) at org.apache.myfaces.shared_impl.context.ExceptionHandlerImpl.handle(ExceptionHandlerImpl.java:156) at org.apache.myfaces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:258) at javax.faces.webapp.FacesServlet.service(FacesServlet.java:191) at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1147) at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:722) at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:449) at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178) at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1020) at com.ibm.ws.webcontainer.servlet.CacheServletWrapper.handleRequest(CacheServletWrapper.java:87) at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:886) at com.ibm.ws.webcontainer.WSWebContainer.handleRequest(WSWebContainer.java:1655) at com.ibm.ws.webcontainer.channel.WCChannelLink.ready(WCChannelLink.java:195) at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:452) at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleNewRequest(HttpInboundLink.java:511) at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.processRequest(HttpInboundLink.java:305) at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.ready(HttpInboundLink.java:276) at 
com.ibm.ws.tcp.channel.impl.NewConnectionInitialReadCallback.sendToDiscriminators(NewConnectionInitialReadCallback.java:214) at com.ibm.ws.tcp.channel.impl.NewConnectionInitialReadCallback.complete(NewConnectionInitialReadCallback.java:113) at com.ibm.ws.tcp.channel.impl.AioReadCompletionListener.futureCompleted(AioReadCompletionListener.java:165) at com.ibm.io.async.AbstractAsyncFuture.invokeCallback(AbstractAsyncFuture.java:217) at com.ibm.io.async.AsyncChannelFuture.fireCompletionActions(AsyncChannelFuture.java:161) at com.ibm.io.async.AsyncFuture.completed(AsyncFuture.java:138) at com.ibm.io.async.ResultHandler.complete(ResultHandler.java:204) at com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775) at com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905) at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1650) Caused by: java.lang.NullPointerException at rts.entities.converter.DepartmentConverter.getAsString(DepartmentConverter.java:29) at org.apache.myfaces.shared_impl.renderkit.RendererUtils.getConvertedStringValue(RendererUtils.java:656) at org.apache.myfaces.shared_impl.renderkit.html.HtmlRendererUtils.getSubmittedOrSelectedValuesAsSet(HtmlRendererUtils.java:444) at org.apache.myfaces.shared_impl.renderkit.html.HtmlRendererUtils.internalRenderSelect(HtmlRendererUtils.java:421) at org.apache.myfaces.shared_impl.renderkit.html.HtmlRendererUtils.renderMenu(HtmlRendererUtils.java:359) at org.apache.myfaces.shared_impl.renderkit.html.HtmlMenuRendererBase.encodeEnd(HtmlMenuRendererBase.java:76) at javax.faces.component.UIComponentBase.encodeEnd(UIComponentBase.java:519) at javax.faces.component.UIComponent.encodeAll(UIComponent.java:626) at javax.faces.component.UIComponent.encodeAll(UIComponent.java:622) at javax.faces.component.UIComponent.encodeAll(UIComponent.java:622) at javax.faces.component.UIComponent.encodeAll(UIComponent.java:622) at org.apache.myfaces.view.facelets.FaceletViewDeclarationLanguage.renderView(FaceletViewDeclarationLanguage.java:1320) at org.apache.myfaces.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:263) at org.apache.myfaces.lifecycle.RenderResponseExecutor.execute(RenderResponseExecutor.java:85) at org.apache.myfaces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:239) ... 24 more
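    The trace bottoms out in DepartmentConverter.getAsString (line 29), the branch that calls object.getClass(), which strongly suggests the converter is being handed a null value when the menu renders with nothing selected on the fresh add form. A null-tolerant sketch of that one method (class and accessor names taken from the post; returning an empty string for null is the assumption here, not something shown in the original code):

    public String getAsString(FacesContext context, UIComponent component, Object object) {
        // A fresh "add" form has no department selected yet, so JSF may pass null here;
        // returning "" avoids dereferencing a null object and the resulting NullPointerException.
        if (object == null || "".equals(object)) {
            return "";
        }
        if (object instanceof Department) {
            return String.valueOf(((Department) object).getDeptno());
        }
        throw new IllegalArgumentException("Invalid object type: " + object.getClass().getName());
    }

    If the render-time NPE was the only blocker, getAsObject should likewise guard against an empty entityId before coercing it to an int.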


  • Spring - PropertiesPlaceholderConfigurer not finding properties file

    - by sat
    Not sure what could be wrong. I had an app that worked all along with this <context:property-placeholder location="classpath:my.properties"/> No problems finding the properties file and hooking things up. Now, I needed to encrypt some fields in the properties file. So I ended up writing the custom PropertiesPlaceholderConfigurer and tried to wire it up like this <bean class="com.mycompany.myapp.PropertiesPlaceholderConfigurer"> <property name="location" value="classpath:my.propeties"/> </bean> With this configuration, Spring complains that it cannot find the properties file. java.io.FileNotFoundException: class path resource [my.propeties] cannot be opened because it does not exist What in addition should be done? The custom placeholder configurer package com.mycompany.myapp; import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer; import org.springframework.util.ObjectUtils; import java.util.Enumeration; import java.util.Properties; public class PropertiesPlaceholderConfigurer extends PropertyPlaceholderConfigurer{ @Override protected void convertProperties(Properties props) { Enumeration<?> propertyNames = props.propertyNames(); while (propertyNames.hasMoreElements()) { String propertyName = (String) propertyNames.nextElement(); String propertyValue = props.getProperty(propertyName); if(propertyName.endsWith("encrypted")){ System.out.println("Decrypting the property " + propertyName); String convertedValue = decrypt(propertyValue); System.out.println("Decrypted the property value to " + convertedValue); if (!ObjectUtils.nullSafeEquals(propertyValue, convertedValue)) { props.setProperty(propertyName, convertedValue); } } } } } Update: Forget my custom placeholder configurer, even the spring provided one has trouble if I replace with this <bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"> <property name="location" value="classpath:my.propeties"/> </bean> What is context:property-placholder doing that the bean definition can't? 
Full stack trace java.lang.IllegalStateException: Failed to load ApplicationContext at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:99) at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75) at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89) at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61) at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71) at org.junit.runners.ParentRunner.run(ParentRunner.java:309) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124) at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103) Caused by: org.springframework.beans.factory.BeanInitializationException: Could not load properties; nested exception is java.io.FileNotFoundException: class path resource [my.propeties] cannot be opened because it does not exist at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:89) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:265) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:162) at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:609) at 
org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:464) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:121) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:100) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:250) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContextInternal(CacheAwareContextLoaderDelegate.java:64) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:91) at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75) at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89) at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61) at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71) at org.junit.runners.ParentRunner.run(ParentRunner.java:309) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124) at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103) Caused by: java.io.FileNotFoundException: class path resource [my.propeties] cannot be opened because it does not exist at 
org.springframework.core.io.ClassPathResource.getInputStream(ClassPathResource.java:158) at org.springframework.core.io.support.EncodedResource.getInputStream(EncodedResource.java:143) at org.springframework.core.io.support.PropertiesLoaderUtils.fillProperties(PropertiesLoaderUtils.java:98) at org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:175) at org.springframework.core.io.support.PropertiesLoaderSupport.mergeProperties(PropertiesLoaderSupport.java:156) at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:80) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:265) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:162) at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:609) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:464) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:121) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:100) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:250) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContextInternal(CacheAwareContextLoaderDelegate.java:64) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:91) at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75) at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89) at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) at 
org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61) at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71) at org.junit.runners.ParentRunner.run(ParentRunner.java:309) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124) at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
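    One thing worth checking before suspecting the configurer itself: the namespace declaration that works points at classpath:my.properties, while both the quoted bean definition and the FileNotFoundException refer to my.propeties, so the two configurations may simply not be naming the same file. A throwaway JDK-only probe (hypothetical class name; the two file names are the spellings that appear in the post) shows which one the test classpath can actually resolve:

    import java.net.URL;

    public class ClasspathProbe {
        public static void main(String[] args) {
            ClassLoader cl = ClasspathProbe.class.getClassLoader();
            // getResource returns null when nothing on the classpath matches that exact name.
            URL a = cl.getResource("my.properties");
            URL b = cl.getResource("my.propeties");
            System.out.println("my.properties -> " + a);
            System.out.println("my.propeties  -> " + b);
        }
    }

    If only the first lookup resolves, aligning the value attribute with that spelling should be enough; if neither does, the file is genuinely missing from the test classpath.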


  • How do I get the PreviewDialog of Apache FOP to actually display my document?

    - by JRSofty
    Search as I may I have not found a solution to my problem here and I'm hoping the combined minds of StackOverflow will push me in the right direction. My problem is as follows, I'm developing a print and print preview portion of a messaging system's user agent. I was given specific XSLT templates that after transforming XML will produce a Formatting Objects document. With Apache FOP I've been able to render the FO document into PDF which is all fine and good, but I would also like to display it in a print preview dialog. Apache FOP contains such a class called PreviewDialog which requires in its constructor a FOUserAgent, which I can generate, and an object implementing the Renderable Interface. The Renderable Interface has one implementing class in the FOP package which is called InputHandler which takes in its constructor a standard io File object. Now here is where the trouble begins. I'm currently storing the FO document as a temp file and pass this as a File object to an InputHandler instance which is then passed to the PreviewDialog. I see the dialog appear on my screen and along the bottom in a status bar it says that it is generating the document, and that is all it does. Here is the code I'm trying to use. It isn't production code so it's not pretty: import java.io.BufferedOutputStream; import java.io.File; import java.io.FileNotFoundException; import java.io.FileOutputStream; import java.io.IOException; import java.io.OutputStream; import java.util.Random; import javax.xml.transform.Result; import javax.xml.transform.Source; import javax.xml.transform.Transformer; import javax.xml.transform.TransformerConfigurationException; import javax.xml.transform.TransformerException; import javax.xml.transform.TransformerFactory; import javax.xml.transform.sax.SAXResult; import javax.xml.transform.stream.StreamResult; import javax.xml.transform.stream.StreamSource; import org.apache.fop.apps.FOPException; import org.apache.fop.apps.FOUserAgent; import org.apache.fop.apps.Fop; import org.apache.fop.apps.FopFactory; import org.apache.fop.cli.InputHandler; import org.apache.fop.render.awt.viewer.PreviewDialog; public class PrintPreview { public void showPreview(final File xslt, final File xmlSource) { boolean err = false; OutputStream out = null; Transformer transformer = null; final String tempFileName = this.getTempDir() + this.generateTempFileName(); final String tempFoFile = tempFileName + ".fo"; final String tempPdfFile = tempFileName + ".pdf"; System.out.println(tempFileName); final TransformerFactory transformFactory = TransformerFactory .newInstance(); final FopFactory fopFactory = FopFactory.newInstance(); try { transformer = transformFactory .newTransformer(new StreamSource(xslt)); final Source src = new StreamSource(xmlSource); out = new FileOutputStream(tempFoFile); final Result res = new StreamResult(out); transformer.transform(src, res); System.out.println("XSLT Transform Completed"); } catch (final TransformerConfigurationException e) { err = true; e.printStackTrace(); } catch (final FileNotFoundException e) { err = true; e.printStackTrace(); } catch (final TransformerException e) { err = true; e.printStackTrace(); } finally { if (out != null) { try { out.close(); } catch (final IOException e) { // TODO Auto-generated catch block e.printStackTrace(); } } } System.out.println("Initializing Preview"); transformer = null; out = null; final File fo = new File(tempFoFile); final File pdf = new File(tempPdfFile); if (!err) { final FOUserAgent ua = fopFactory.newFOUserAgent(); try 
{ transformer = transformFactory.newTransformer(); out = new FileOutputStream(pdf); out = new BufferedOutputStream(out); final Fop fop = fopFactory.newFop( MimeConstants.MIME_PDF, ua, out); final Source foSrc = new StreamSource(fo); final Result foRes = new SAXResult(fop.getDefaultHandler()); transformer.transform(foSrc, foRes); System.out.println("Transformation Complete"); } catch (final FOPException e) { err = true; e.printStackTrace(); } catch (final FileNotFoundException e) { err = true; e.printStackTrace(); } catch (final TransformerException e) { err = true; e.printStackTrace(); } finally { if (out != null) { try { out.close(); } catch (final IOException e) { // TODO Auto-generated catch block e.printStackTrace(); } } } if (!err) { System.out.println("Attempting to Preview"); final InputHandler inputHandler = new InputHandler(fo); PreviewDialog.createPreviewDialog(ua, inputHandler, true); } } // perform the clean up // f.delete(); } private String getTempDir() { final String p = "java.io.tmpdir"; return System.getProperty(p); } private String generateTempFileName() { final String charset = "abcdefghijklmnopqrstuvwxyz1234567890abcdefghijklmnopqrstuvwxyz1234567890"; final StringBuffer sb = new StringBuffer(); Random r = new Random(); int seed = r.nextInt(); r = new Random(seed); for (int i = 0; i < 8; i++) { final int n = r.nextInt(71); seed = r.nextInt(); sb.append(charset.charAt(n)); r = new Random(seed); } return sb.toString(); } } Any help on this would be appreciated.
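    One alternative sometimes suggested (an untested sketch; MimeConstants.MIME_FOP_AWT_PREVIEW and the stream-less newFop overload are assumptions about the FOP API, not taken from the post) is to let FOP's own AWT renderer open the preview window, the same way the fop command line's -awt option does, reusing the FO file, factory and transformer already created above:

    // Sketch only: drive the generated .fo file through FOP's AWT preview renderer.
    FOUserAgent previewAgent = fopFactory.newFOUserAgent();
    Fop previewFop = fopFactory.newFop(MimeConstants.MIME_FOP_AWT_PREVIEW, previewAgent);
    Transformer identity = transformFactory.newTransformer();
    // An identity transform pushes the FO document into the renderer, which shows its own dialog.
    identity.transform(new StreamSource(fo), new SAXResult(previewFop.getDefaultHandler()));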


  • java: how to compress data into a String and uncompress data from the String

    - by Guillaume
    I want to put some compressed data into a remote repository. To put data on this repository I can only use a method that take the name of the resource and its content as a String. (like data.txt + "hello world"). The repository is moking a filesystem but is not, so I can not use File directly. I want to be able to do the following: client send to server a file 'data.txt' server compress 'data.txt' into data.zip server send to repository content of data.zip repository store data.zip client download from repository data.zip and his able to open it with its favorite zip tool I have tried a lots of compressing example found on the web but each time a send the data to the repository, my resulting zip file is corrupted. Here is a sample class, using the zip*stream and that emulate the repository showcasing my problem. The created zip file is working, but after its 'serialization' it's get corrupted. (the sample class use jakarta commons.io ) Many thanks for your help. package zip; import java.io.File; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.util.zip.ZipEntry; import java.util.zip.ZipInputStream; import java.util.zip.ZipOutputStream; import org.apache.commons.io.FileUtils; /** * Date: May 19, 2010 - 6:13:07 PM * * @author Guillaume AME. */ public class ZipMe { public static void addOrUpdate(File zipFile, File ... files) throws IOException { File tempFile = File.createTempFile(zipFile.getName(), null); // delete it, otherwise you cannot rename your existing zip to it. tempFile.delete(); boolean renameOk = zipFile.renameTo(tempFile); if (!renameOk) { throw new RuntimeException("could not rename the file " + zipFile.getAbsolutePath() + " to " + tempFile.getAbsolutePath()); } byte[] buf = new byte[1024]; ZipInputStream zin = new ZipInputStream(new FileInputStream(tempFile)); ZipOutputStream out = new ZipOutputStream(new FileOutputStream(zipFile)); ZipEntry entry = zin.getNextEntry(); while (entry != null) { String name = entry.getName(); boolean notInFiles = true; for (File f : files) { if (f.getName().equals(name)) { notInFiles = false; break; } } if (notInFiles) { // Add ZIP entry to output stream. out.putNextEntry(new ZipEntry(name)); // Transfer bytes from the ZIP file to the output file int len; while ((len = zin.read(buf)) > 0) { out.write(buf, 0, len); } } entry = zin.getNextEntry(); } // Close the streams zin.close(); // Compress the files if (files != null) { for (File file : files) { InputStream in = new FileInputStream(file); // Add ZIP entry to output stream. 
out.putNextEntry(new ZipEntry(file.getName())); // Transfer bytes from the file to the ZIP file int len; while ((len = in.read(buf)) > 0) { out.write(buf, 0, len); } // Complete the entry out.closeEntry(); in.close(); } // Complete the ZIP file } tempFile.delete(); out.close(); } public static void main(String[] args) throws IOException { final String zipArchivePath = "c:/temp/archive.zip"; final String tempFilePath = "c:/temp/data.txt"; final String resultZipFile = "c:/temp/resultingArchive.zip"; File zipArchive = new File(zipArchivePath); FileUtils.touch(zipArchive); File tempFile = new File(tempFilePath); FileUtils.writeStringToFile(tempFile, "hello world"); addOrUpdate(zipArchive, tempFile); //archive.zip exists and contains a compressed data.txt that can be read using winrar //now simulate writing of the zip into a in memory cache String archiveText = FileUtils.readFileToString(zipArchive); FileUtils.writeStringToFile(new File(resultZipFile), archiveText); //resultingArchive.zip exists, contains a compressed data.txt, but it can not //be read using winrar: CRC failed in data.txt. The file is corrupt } }
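    The corruption comes from round-tripping the zip bytes through readFileToString and writeStringToFile, which decode and re-encode binary data as characters. If the repository really only accepts a String, one workaround (a sketch, not the poster's code; it assumes Java 8's java.util.Base64 plus the Commons IO FileUtils already on the classpath, and older JVMs can use commons-codec instead) is to Base64-encode the raw bytes:

    import java.io.File;
    import java.io.IOException;
    import java.util.Base64;
    import org.apache.commons.io.FileUtils;

    public class ZipAsString {
        // Encode the raw zip bytes so they survive a String-only transport unchanged.
        public static String encode(File zipFile) throws IOException {
            return Base64.getEncoder().encodeToString(FileUtils.readFileToByteArray(zipFile));
        }

        // Reverse step: turn the stored String back into a byte-identical zip file.
        public static void decode(String stored, File target) throws IOException {
            FileUtils.writeByteArrayToFile(target, Base64.getDecoder().decode(stored));
        }
    }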


  • java: how to get a string representation of a compressed byte array ?

    - by Guillaume
    I want to put some compressed data into a remote repository. To put data on this repository I can only use a method that take the name of the resource and its content as a String. (like data.txt + "hello world"). The repository is moking a filesystem but is not, so I can not use File directly. I want to be able to do the following: client send to server a file 'data.txt' server compress 'data.txt' into a compressed file 'data.zip' server send a string representation of data.zip to the repository repository store data.zip client download from repository data.zip and his able to open it with its favorite zip tool The problem arise at step 3 when I try to get a string representation of my compressed file. Here is a sample class, using the zip*stream and that emulate the repository showcasing my problem. The created zip file is working, but after its 'serialization' it's get corrupted. (the sample class use jakarta commons.io ) Many thanks for your help. package zip; import java.io.File; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.util.zip.ZipEntry; import java.util.zip.ZipInputStream; import java.util.zip.ZipOutputStream; import org.apache.commons.io.FileUtils; /** * Date: May 19, 2010 - 6:13:07 PM * * @author Guillaume AME. */ public class ZipMe { public static void addOrUpdate(File zipFile, File ... files) throws IOException { File tempFile = File.createTempFile(zipFile.getName(), null); // delete it, otherwise you cannot rename your existing zip to it. tempFile.delete(); boolean renameOk = zipFile.renameTo(tempFile); if (!renameOk) { throw new RuntimeException("could not rename the file " + zipFile.getAbsolutePath() + " to " + tempFile.getAbsolutePath()); } byte[] buf = new byte[1024]; ZipInputStream zin = new ZipInputStream(new FileInputStream(tempFile)); ZipOutputStream out = new ZipOutputStream(new FileOutputStream(zipFile)); ZipEntry entry = zin.getNextEntry(); while (entry != null) { String name = entry.getName(); boolean notInFiles = true; for (File f : files) { if (f.getName().equals(name)) { notInFiles = false; break; } } if (notInFiles) { // Add ZIP entry to output stream. out.putNextEntry(new ZipEntry(name)); // Transfer bytes from the ZIP file to the output file int len; while ((len = zin.read(buf)) > 0) { out.write(buf, 0, len); } } entry = zin.getNextEntry(); } // Close the streams zin.close(); // Compress the files if (files != null) { for (File file : files) { InputStream in = new FileInputStream(file); // Add ZIP entry to output stream. 
out.putNextEntry(new ZipEntry(file.getName())); // Transfer bytes from the file to the ZIP file int len; while ((len = in.read(buf)) > 0) { out.write(buf, 0, len); } // Complete the entry out.closeEntry(); in.close(); } // Complete the ZIP file } tempFile.delete(); out.close(); } public static void main(String[] args) throws IOException { final String zipArchivePath = "c:/temp/archive.zip"; final String tempFilePath = "c:/temp/data.txt"; final String resultZipFile = "c:/temp/resultingArchive.zip"; File zipArchive = new File(zipArchivePath); FileUtils.touch(zipArchive); File tempFile = new File(tempFilePath); FileUtils.writeStringToFile(tempFile, "hello world"); addOrUpdate(zipArchive, tempFile); //archive.zip exists and contains a compressed data.txt that can be read using winrar //now simulate writing of the zip into a in memory cache String archiveText = FileUtils.readFileToString(zipArchive); FileUtils.writeStringToFile(new File(resultZipFile), archiveText); //resultingArchive.zip exists, contains a compressed data.txt, but it can not //be read using winrar: CRC failed in data.txt. The file is corrupt } }
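    As in the previous, near-identical question, the zip gets corrupted because step 3 turns binary data into text. A sketch of the end of main() with the String representation done via Base64 instead (assumes Java 8's java.util.Base64; the variable names are reused from the post):

    // Requires: import java.util.Base64;
    String archiveText = Base64.getEncoder()
            .encodeToString(FileUtils.readFileToByteArray(zipArchive));
    FileUtils.writeByteArrayToFile(new File(resultZipFile),
            Base64.getDecoder().decode(archiveText));
    // resultingArchive.zip is now byte-for-byte identical to archive.zip.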


  • Couldn't match expected type - Haskell Code

    - by wvyar
    I'm trying to learn Haskell, but the small bit of sample code I tried to write is running into a fairly large amount of "Couldn't match expected type" errors. Can anyone give me some guidance as to what I'm doing wrong/how I should go about this? These are the errors, but I'm not really sure how I should be writing my code. toDoSchedulerSimple.hs:6:14: Couldn't match expected type `[t0]' with actual type `IO String' In the return type of a call of `readFile' In a stmt of a 'do' block: f <- readFile inFile In the expression: do { f <- readFile inFile; lines f } toDoSchedulerSimple.hs:27:9: Couldn't match expected type `[a0]' with actual type `IO ()' In the return type of a call of `putStr' In a stmt of a 'do' block: putStr "Enter task name: " In the expression: do { putStr "Enter task name: "; task <- getLine; return inFileArray : task } toDoSchedulerSimple.hs:34:9: Couldn't match expected type `IO ()' with actual type `[a0]' In a stmt of a 'do' block: putStrLn "Your task is: " ++ (inFileArray !! i) In the expression: do { i <- randomRIO (0, (length inFileArray - 1)); putStrLn "Your task is: " ++ (inFileArray !! i) } In an equation for `getTask': getTask inFileArray = do { i <- randomRIO (0, (length inFileArray - 1)); putStrLn "Your task is: " ++ (inFileArray !! i) } toDoSchedulerSimple.hs:41:9: Couldn't match expected type `[a0]' with actual type `IO ()' In the return type of a call of `putStr' In a stmt of a 'do' block: putStr "Enter the task you would like to end: " In the expression: do { putStr "Enter the task you would like to end: "; task <- getLine; filter (endTaskCheck task) inFileArray } toDoSchedulerSimple.hs:60:53: Couldn't match expected type `IO ()' with actual type `[String] -> IO ()' In a stmt of a 'do' block: schedulerSimpleMain In the expression: do { (getTask inFileArray); schedulerSimpleMain } In a case alternative: "get-task" -> do { (getTask inFileArray); schedulerSimpleMain } This is the code itself. I think it's fairly straightforward, but the idea is to run a loop, take input, and perform actions based off of it by calling other functions. import System.Random (randomRIO) import Data.List (lines) initializeFile :: [char] -> [String] initializeFile inFile = do f <- readFile inFile let parsedFile = lines f return parsedFile displayHelp :: IO() displayHelp = do putStrLn "Welcome to To Do Scheduler Simple, written in Haskell." putStrLn "Here are some commands you might find useful:" putStrLn " 'help' : Display this menu." putStrLn " 'quit' : Exit the program." putStrLn " 'new-task' : Create a new task." putStrLn " 'get-task' : Randomly select a task." putStrLn " 'end-task' : Mark a task as finished." putStrLn " 'view-tasks' : View all of your tasks." quit :: IO() quit = do putStrLn "We're very sad to see you go...:(" putStrLn "Come back soon!" createTask :: [String] -> [String] createTask inFileArray = do putStr "Enter task name: " task <- getLine return inFileArray:task getTask :: [String] -> IO() getTask inFileArray = do i <- randomRIO (0, (length inFileArray - 1)) putStrLn "Your task is: " ++ (inFileArray !! i) endTaskCheck :: String -> String -> Bool endTaskCheck str1 str2 = str1 /= str2 endTask :: [String] -> [String] endTask inFileArray = do putStr "Enter the task you would like to end: " task <- getLine return filter (endTaskCheck task) inFileArray viewTasks :: [String] -> IO() viewTasks inFileArray = case inFileArray of [] -> do putStrLn "\nEnd of tasks." 
_ -> do putStrLn (head inFileArray) viewTasks (tail inFileArray) schedulerSimpleMain :: [String] -> IO() schedulerSimpleMain inFileArray = do putStr "SchedulerSimple> " input <- getLine case input of "help" -> displayHelp "quit" -> quit "new-task" -> schedulerSimpleMain (createTask inFileArray) "get-task" -> do (getTask inFileArray); schedulerSimpleMain "end-task" -> schedulerSimpleMain (endTask inFileArray) "view-tasks" -> do (viewTasks inFileArray); schedulerSimpleMain _ -> do putStrLn "Invalid input."; schedulerSimpleMain main :: IO() main = do putStr "What is the name of the schedule? " sName <- getLine schedulerSimpleMain (initializeFile sName) Thanks, and apologies if this isn't the correct place to be asking such a question.


  • rsync on QNAP NAS fails recently

    - by user192702
    I have been using rsync to copy a large backup file from a remote host to my QNAP NAS. It had been working fine until recently; now almost every time it executes, it times out after about 15 seconds. Following is what I have captured in the log. Any ideas? 2013-11-10 23:10:01 HKT - Executing: rsync -t -v -e ssh [email protected]:/home/backup/backup/backup_file-11102013* /share/homes/backup/backup/web/database [receiver] io timeout after 10 seconds -- exiting rsync error: timeout in data send/receive (code 30) at io.c(140) [receiver=3.0.7] rsync: connection unexpectedly closed (73 bytes received so far) [generator] rsync error: error in rsync protocol data stream (code 12) at io.c(601) [generator=3.0.7] 2013-11-10 23:10:15 HKT - Done rsync


  • What does 'Highest active time' for disk activity in Windows resource monitor mean?

    - by Nick R
    I know what disk IO, disk queue length and other measures are, but what does 'Highest active time' mean? Is it the amount of time the disk is busy handling requests, or something else? When it is high, does it mean the CPU is busy doing IO work, or is it just indicating that the disk is busy handling requests? I'm trying to work out whether 50% active time means that 50% of the time the disk is either seeking, reading or writing, rather than the kernel spending 50% of its time servicing IO requests. Edit: Another quick data point here. If you look at the difference between an SSD and a physical disk, the SSD has significantly less activity, so I guess this really means the amount of time the operating system is waiting for the disk to respond and return data.


  • Looping Redirect with PyFacebook and Google App Engine

    - by Nick Gotch
    I have a Python Facebook project hosted on Google App Engine and use the following code to handle initialization of the Facebook API using PyFacebook. # Facebook Initialization def initialize_facebook(f): # Redirection handler def redirect(self, url): logger.info('Redirecting the user to: ' + url) self.response.headers.add_header("Cache-Control", "max-age=0") self.response.headers.add_header("Pragma", "no-cache") self.response.out.write('<html><head><script>parent.location.replace(\'' + url + '\');</script></head></html>') return 'Moved temporarily' auth_token = request.params.get('auth_token', None) fbapi = Facebook(settings['FACEBOOK_API_KEY'], settings['FACEBOOK_SECRET_KEY'], auth_token=auth_token) if not fbapi: logger.error('Facebook failed to initialize') if fbapi.check_session(request) or auth_token: pass else: logger.info('User not logged into Facebook') return lambda a: redirect(a, fbapi.get_login_url()) if fbapi.added: pass else: logger.info('User does not have ' + settings['FACEBOOK_APP_NAME'] + ' added') return lambda a: redirect(a, fbapi.get_add_url()) # Return the validated API logger.info('Facebook successfully initialized') return lambda a: f(a, fbapi=fbapi) I'm trying to set it up so that I can drop this decorator on any page handler method and verify that the user has everything set up correctly. The issue is that when the redirect handler gets called, it starts an infinite loop of redirection. I tried using an HTTP 302 redirection in place of the JavaScript but that kept failing too. Does anyone know what I can do to fix this? I saw this similar question but there are no answers.


  • PHP/Java bridge problem

    - by Jack
    I am using tomcat 6 on windows. Here is the code I am testing. import java.io.ByteArrayOutputStream; import java.io.Closeable; import java.io.StringReader; import javax.script.Invocable; import javax.script.ScriptEngine; import javax.script.ScriptEngineManager; /** * Create and run THREAD_COUNT PHP threads, concurrently accessing a * shared resource. * * Create 5 script engines, passing each a shared resource allocated * from Java. Each script engine has to implement Runnable. * * Java accesses the Runnable script engine using * scriptEngine.getInterface() and calls thread.start() to invoke each * PHP Runnable implementations concurrently. */ class PhpThreads { public static final String runnable = new String("<?php\n" + "function run() {\n" + " $out = java_context()->getAttribute('sharedResource', 100);\n" + " $nr = (string)java_context()->getAttribute('nr', 100);\n" + " echo \"started thread: $nr\n\";\n" + " for($i=0; $i<100; $i++) {\n" + " $out->write(ord($nr));\n" + " java('java.lang.Thread')->sleep(1);\n" + " }\n" + "}\n" + "?>\n"); static final int THREAD_COUNT = 5; public static void main(String[] args) throws Exception { ScriptEngineManager manager = new ScriptEngineManager(); Thread threads[] = new Thread[THREAD_COUNT]; ScriptEngine engines[] = new ScriptEngine[THREAD_COUNT]; ByteArrayOutputStream sharedResource = new ByteArrayOutputStream(); StringReader runnableReader = new StringReader(runnable); // create THREAD_COUNT PHP threads for (int i=0; i<THREAD_COUNT; i++) { engines[i] = manager.getEngineByName("php-invocable"); if (engines[i] == null) throw new NullPointerException ("php script engine not found"); engines[i].put("nr", new Integer(i+1)); engines[i].put("sharedResource", sharedResource); engines[i].eval(runnableReader); runnableReader.reset(); // cast the whole script to Runnable; note also getInterface(specificClosure, type) Runnable r = (Runnable) ((Invocable)engines[i]).getInterface(Runnable.class); threads[i] = new Thread(r); } // run the THREAD_COUNT PHP threads for (int i=0; i<THREAD_COUNT; i++) { threads[i].start(); } // wait for the THREAD_COUNT PHP threads to finish for (int i=0; i<THREAD_COUNT; i++) { threads[i].join(); ((Closeable)engines[i]).close(); } // print the output generated by the THREAD_COUNT concurrent threads String result = sharedResource.toString(); System.out.println(result); // Check result Object res=manager.getEngineByName("php").eval( "<?php " + "exit((int)('10011002100310041005'!=" + "@system(\"echo -n "+result+"|sed 's/./&\\\n/g'|sort|uniq -c|tr -d ' \\\n'\")));" + "?>"); System.exit(((Number)res).intValue()); } } I have added all the libraries. 
When I run the file I get the following error - run: Exception in thread "main" javax.script.ScriptException: java.io.IOException: Cannot run program "php-cgi": CreateProcess error=2, The system cannot find the file specified at php.java.script.InvocablePhpScriptEngine.eval(InvocablePhpScriptEngine.java:209) at php.java.script.SimplePhpScriptEngine.eval(SimplePhpScriptEngine.java:178) at javax.script.AbstractScriptEngine.eval(AbstractScriptEngine.java:232) at PhpThreads.main(NewClass.java:53) Caused by: java.io.IOException: Cannot run program "php-cgi": CreateProcess error=2, The system cannot find the file specified at java.lang.ProcessBuilder.start(ProcessBuilder.java:459) at java.lang.Runtime.exec(Runtime.java:593) at php.java.bridge.Util$Process.start(Util.java:1064) at php.java.bridge.Util$ProcessWithErrorHandler.start(Util.java:1166) at php.java.bridge.Util$ProcessWithErrorHandler.start(Util.java:1217) at php.java.script.CGIRunner.doRun(CGIRunner.java:126) at php.java.script.HttpProxy.doRun(HttpProxy.java:63) at php.java.script.CGIRunner.run(CGIRunner.java:111) at php.java.bridge.ThreadPool$Delegate.run(ThreadPool.java:60) Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified at java.lang.ProcessImpl.create(Native Method) at java.lang.ProcessImpl.<init>(ProcessImpl.java:81) at java.lang.ProcessImpl.start(ProcessImpl.java:30) at java.lang.ProcessBuilder.start(ProcessBuilder.java:452) ... 8 more What am I missing?
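    CreateProcess error=2 is Windows' 'file not found': the php-invocable script engine shells out to a php-cgi executable that the JVM running Tomcat cannot locate on its PATH. A minimal JDK-only probe (hypothetical class name) that reproduces what the bridge attempts:

    import java.io.IOException;

    public class PhpCgiProbe {
        public static void main(String[] args) {
            try {
                // Same shape as the bridge's call: launch "php-cgi" with no explicit path.
                Process p = new ProcessBuilder("php-cgi", "-v").start();
                p.destroy();
                System.out.println("php-cgi is resolvable from this JVM's PATH");
            } catch (IOException e) {
                System.out.println("php-cgi could not be started: " + e.getMessage());
            }
        }
    }

    If the probe fails as well, installing PHP's CGI build (php-cgi.exe on Windows), adding its folder to the PATH seen by the Tomcat process, and restarting Tomcat should remove this particular error.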


  • Debian apt dependency mismatch (libc6)

    - by Sean Gordon
    Earlier, I tried to install package via apt-get (cython), but it failed with the Errors were encountered while processing: message, and since then, apt is refusing to install anything. apt-get check output below: root@dix:~# apt-get check Reading package lists... Done Building dependency tree Reading state information... Done You might want to run 'apt-get -f install' to correct these. The following packages have unmet dependencies: libc6 : Depends: libc-bin (= 2.11.3-2) but 2.11.3-4 is installed libc6-dev : Depends: libc6 (= 2.11.3-4) but 2.11.3-2 is installed libc6-i386 : Depends: libc6 (= 2.11.3-4) but 2.11.3-2 is installed E: Unmet dependencies. Try using -f. Apt/aptitude don't seem to be able to fix this dependency issue, and I don't know what to do. Edit: Running apt-get -f install results in no change, and my sources are all squeeze. Running apt-get update then apt-get dist-upgrade show no change either. Edit 2: I went back to try this again in a new terminal and apt-get -f install gives this error: dpkg: error processing /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb (--unpack): subprocess new pre-installation script killed by signal (Aborted) configured to not write apport reports Errors were encountered while processing: /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb E: Sub-process /usr/bin/dpkg returned an error code (1) Edit 3: Using apt-get clean first, then the previous commands, results in the first error again. Using apt-get -f dist-upgrade gives the below. Reading package lists... Building dependency tree... Reading state information... Correcting dependencies... Done The following packages will be upgraded: apache2 apache2-mpm-prefork apache2-utils apache2.2-bin apache2.2-common at automake base-files bind9 bind9-doc bind9-host bind9utils debian-archive-keyring dnsutils dpkg-dev file host initscripts isc-dhcp-client isc-dhcp-common krb5-multidev libapr1 libbind9-60 libc6 libdns69 libdpkg-perl libexpat1 libexpat1-dev libgc1c2 libgssapi-krb5-2 libgssrpc4 libisc62 libisccc60 libisccfg62 libk5crypto3 libkadm5clnt-mit7 libkadm5srv-mit7 libkdb5-4 libkrb5-3 libkrb5-dev libkrb5support0 liblwres60 libmagic1 libmysqlclient16 libnss3-1d libssl-dev libssl0.9.8 libtiff4 libtiff4-dev libtiffxx0c2 libxi6 libxml2 linux-libc-dev lwresd mysql-client-5.1 mysql-common mysql-server mysql-server-5.1 mysql-server-core-5.1 openjdk-6-jre openjdk-6-jre-headless openjdk-6-jre-lib openssh-client openssh-server openssl procps python python-crypto python-minimal sudo sysv-rc sysvinit sysvinit-utils tzdata tzdata-java 75 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 5 not fully installed or removed. Need to get 0 B/79.9 MB of archives. After this operation, 1,411 kB of additional disk space will be used. (Reading database ... 52241 files and directories currently installed.) Preparing to replace libc6 2.11.3-2 (using .../libc6_2.11.3-4_amd64.deb) ... 
*** stack smashing detected ***: /usr/bin/perl terminated ======= Backtrace: ========= /lib/libc.so.6(__fortify_fail+0x37)[0x7fdaad9b9f87] /lib/libc.so.6(__fortify_fail+0x0)[0x7fdaad9b9f50] /usr/lib/libperl.so.5.10(Perl_yylex+0x5896)[0x7fdaae343346] [0x8e83a0] ======= Memory map: ======== 00400000-00402000 r-xp 00000000 08:01 525338 /usr/bin/perl 00601000-00602000 rw-p 00001000 08:01 525338 /usr/bin/perl 00602000-0091f000 rw-p 00000000 00:00 0 [heap] 7fdaaca54000-7fdaaca6a000 r-xp 00000000 08:01 393818 /lib/libgcc_s.so.1 7fdaaca6a000-7fdaacc69000 ---p 00016000 08:01 393818 /lib/libgcc_s.so.1 7fdaacc69000-7fdaacc6a000 rw-p 00015000 08:01 393818 /lib/libgcc_s.so.1 7fdaacc6a000-7fdaacc6f000 r-xp 00000000 08:01 524949 /usr/lib/perl5/auto/Locale/gettext/gettext.so 7fdaacc6f000-7fdaace6e000 ---p 00005000 08:01 524949 /usr/lib/perl5/auto/Locale/gettext/gettext.so 7fdaace6e000-7fdaace6f000 rw-p 00004000 08:01 524949 /usr/lib/perl5/auto/Locale/gettext/gettext.so 7fdaace6f000-7fdaace79000 r-xp 00000000 08:01 532753 /usr/lib/perl/5.10.1/auto/Encode/Encode.so 7fdaace79000-7fdaad078000 ---p 0000a000 08:01 532753 /usr/lib/perl/5.10.1/auto/Encode/Encode.so 7fdaad078000-7fdaad079000 rw-p 00009000 08:01 532753 /usr/lib/perl/5.10.1/auto/Encode/Encode.so 7fdaad079000-7fdaad07e000 r-xp 00000000 08:01 525444 /usr/lib/perl/5.10.1/auto/IO/IO.so 7fdaad07e000-7fdaad27d000 ---p 00005000 08:01 525444 /usr/lib/perl/5.10.1/auto/IO/IO.so 7fdaad27d000-7fdaad27e000 rw-p 00004000 08:01 525444 /usr/lib/perl/5.10.1/auto/IO/IO.so 7fdaad27e000-7fdaad299000 r-xp 00000000 08:01 525450 /usr/lib/perl/5.10.1/auto/POSIX/POSIX.so 7fdaad299000-7fdaad498000 ---p 0001b000 08:01 525450 /usr/lib/perl/5.10.1/auto/POSIX/POSIX.so 7fdaad498000-7fdaad49b000 rw-p 0001a000 08:01 525450 /usr/lib/perl/5.10.1/auto/POSIX/POSIX.so 7fdaad49b000-7fdaad49e000 r-xp 00000000 08:01 525436 /usr/lib/perl/5.10.1/auto/Fcntl/Fcntl.so 7fdaad49e000-7fdaad69e000 ---p 00003000 08:01 525436 /usr/lib/perl/5.10.1/auto/Fcntl/Fcntl.so 7fdaad69e000-7fdaad69f000 rw-p 00003000 08:01 525436 /usr/lib/perl/5.10.1/auto/Fcntl/Fcntl.so 7fdaad69f000-7fdaad6a7000 r-xp 00000000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad6a7000-7fdaad8a6000 ---p 00008000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad8a6000-7fdaad8a7000 r--p 00007000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad8a7000-7fdaad8a8000 rw-p 00008000 08:01 393824 /lib/libcrypt-2.11.3.so 7fdaad8a8000-7fdaad8d6000 rw-p 00000000 00:00 0 7fdaad8d6000-7fdaada2f000 r-xp 00000000 08:01 393822 /lib/libc-2.11.3.so 7fdaada2f000-7fdaadc2e000 ---p 00159000 08:01 393822 /lib/libc-2.11.3.so 7fdaadc2e000-7fdaadc32000 r--p 00158000 08:01 393822 /lib/libc-2.11.3.so 7fdaadc32000-7fdaadc33000 rw-p 0015c000 08:01 393822 /lib/libc-2.11.3.so 7fdaadc33000-7fdaadc38000 rw-p 00000000 00:00 0 7fdaadc38000-7fdaadc4f000 r-xp 00000000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaadc4f000-7fdaade4e000 ---p 00017000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaade4e000-7fdaade4f000 r--p 00016000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaade4f000-7fdaade50000 rw-p 00017000 08:01 393248 /lib/libpthread-2.11.3.so 7fdaade50000-7fdaade54000 rw-p 00000000 00:00 0 7fdaade54000-7fdaaded4000 r-xp 00000000 08:01 393826 /lib/libm-2.11.3.so 7fdaaded4000-7fdaae0d4000 ---p 00080000 08:01 393826 /lib/libm-2.11.3.so 7fdaae0d4000-7fdaae0d5000 r--p 00080000 08:01 393826 /lib/libm-2.11.3.so 7fdaae0d5000-7fdaae0d6000 rw-p 00081000 08:01 393826 /lib/libm-2.11.3.so 7fdaae0d6000-7fdaae0d8000 r-xp 00000000 08:01 393825 /lib/libdl-2.11.3.so 7fdaae0d8000-7fdaae2d8000 ---p 00002000 
08:01 393825 /lib/libdl-2.11.3.so 7fdaae2d8000-7fdaae2d9000 r--p 00002000 08:01 393825 /lib/libdl-2.11.3.so 7fdaae2d9000-7fdaae2da000 rw-p 00003000 08:01 393825 /lib/libdl-2.11.3.so 7fdaae2da000-7fdaae43f000 r-xp 00000000 08:01 525387 /usr/lib/libperl.so.5.10.1 7fdaae43f000-7fdaae63e000 ---p 00165000 08:01 525387 /usr/lib/libperl.so.5.10.1 7fdaae63e000-7fdaae647000 rw-p 00164000 08:01 525387 /usr/lib/libperl.so.5.10.1 7fdaae647000-7fdaae665000 r-xp 00000000 08:01 393819 /lib/ld-2.11.3.so 7fdaae854000-7fdaae859000 rw-p 00000000 00:00 0 7fdaae862000-7fdaae864000 rw-p 00000000 00:00 0 7fdaae864000-7fdaae865000 r--p 0001d000 08:01 393819 /lib/ld-2.11.3.so 7fdaae865000-7fdaae866000 rw-p 0001e000 08:01 393819 /lib/ld-2.11.3.so 7fdaae866000-7fdaae867000 rw-p 00000000 00:00 0 7fff9616d000-7fff9618e000 rw-p 00000000 00:00 0 [stack] 7fff961ff000-7fff96200000 r-xp 00000000 00:00 0 [vdso] ffffffffff600000-ffffffffff601000 r--p 00000000 00:00 0 [vsyscall] dpkg: error processing /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb (--unpack): subprocess new pre-installation script killed by signal (Aborted) Errors were encountered while processing: /var/cache/apt/archives/libc6_2.11.3-4_amd64.deb


  • Playing an Online mp3

    - by Mohsen
    I have a problem with playing online mp3s. I'm using the latest version of JavaZoom's JLayer and BasicPlayer. Here is the exception: Caused by: javazoom.jlgui.basicplayer.BasicPlayerException: java.io.EOFException at javazoom.jlgui.basicplayer.BasicPlayer.initAudioInputStream(Unknown Source) at javazoom.jlgui.basicplayer.BasicPlayer.open(Unknown Source) ... 12 more Caused by: java.io.EOFException at java.io.DataInputStream.readInt(DataInputStream.java:375) at com.sun.media.sound.WaveFileReader.getFMT(WaveFileReader.java:244) at com.sun.media.sound.WaveFileReader.getAudioFileFormat(WaveFileReader.java:85) at javax.sound.sampled.AudioSystem.getAudioFileFormat(AudioSystem.java:985) at javazoom.jlgui.basicplayer.BasicPlayer.initAudioInputStream(Unknown Source) ... 15 more My Java is 1.6.0_16. Certain files cannot be played over the Internet. I have a set of mp3s, playing one after the other. Randomly, one mp3 doesn't work, throwing the above exception. Some mp3s can be played by calling BasicPlayer's play() method again, but others can never be played online. I was able to find this post, but I doubt it really relates to my DirectX version. Mohsen
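    The nested trace shows AudioSystem probing the stream with the WAVE file reader and hitting EOF, so the format detection over the network connection is what fails for some files rather than the mp3s themselves. One workaround to try (a sketch using only JDK I/O; the player.open(...) call in the usage comment is an assumption, so check the overloads in your basicplayer jar) is to download each mp3 to a temporary file and hand the player a local file:

    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.URL;

    public class Mp3Fetch {
        // Copy the remote mp3 to a local temp file so that format detection runs
        // against a plain local file instead of the live HTTP stream.
        public static File fetch(String mp3Url) throws IOException {
            File tmp = File.createTempFile("stream", ".mp3");
            InputStream in = new BufferedInputStream(new URL(mp3Url).openStream());
            OutputStream out = new BufferedOutputStream(new FileOutputStream(tmp));
            try {
                byte[] buf = new byte[8192];
                int len;
                while ((len = in.read(buf)) != -1) {
                    out.write(buf, 0, len);
                }
            } finally {
                out.close();
                in.close();
            }
            return tmp;
        }
        // Hypothetical usage: player.open(Mp3Fetch.fetch(url)); then player.play();
    }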


  • please explain my fio results - is O_SYNC|O_DIRECT misbehaving on linux?

    - by Zoltan
    I'm going mad over figuring out what the problem could be with one of our storage boxes. With a simple fio script I'm testing random writes using bs=1M and direct=1. The SSD is a Samsung 840pro attached to an LSI HBA (3Gbit/s ports). This is the result I'm getting under FreeBSD 9.1: WRITE: io=13169MB, aggrb=224743KB/s, minb=224743KB/s, maxb=224743KB/s, mint=60002msec, maxt=60002msec This is regardless of sync being set to 0 or 1. On Linux, this is the result with sync=0: WRITE: io=14828MB, aggrb=253060KB/s, minb=253060KB/s, maxb=253060KB/s, mint=60001msec, maxt=60001msec and with sync=1: WRITE: io=6360.0MB, aggrb=108542KB/s, minb=108542KB/s, maxb=108542KB/s, mint=60001msec, maxt=60001msec My understanding is that since I'm operating on the raw block device, O_SYNC should not make any difference - there's no filesystem, no barrier, nothing between the writes and the drive itself. Especially with O_DIRECT|O_SYNC set. Any ideas? For reference, here's the fio script I'm testing with: [global] bs=1M ioengine=sync iodepth=4 size=16g direct=1 runtime=60 filename=/dev/sdh sync=1 [rand-write] rw=randwrite stonewall
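
    As a cross-check outside fio, the O_DIRECT|O_SYNC behaviour on the raw device can be reproduced with a short script. This is a minimal sketch in Python, not the original fio setup: /dev/sdh, the 1M block size and the 15-second runtime are placeholders, the buffer comes from an anonymous mmap so it meets O_DIRECT's alignment requirement, and running it will overwrite data on the target device.

        import mmap, os, random, time

        DEV = "/dev/sdh"              # placeholder raw device -- data will be destroyed
        BS = 1024 * 1024              # 1M blocks, matching the fio job
        RUNTIME = 15                  # seconds

        buf = mmap.mmap(-1, BS)       # anonymous map: page-aligned, as O_DIRECT requires
        buf.write(os.urandom(BS))

        fd = os.open(DEV, os.O_WRONLY | os.O_DIRECT | os.O_SYNC)
        size = os.lseek(fd, 0, os.SEEK_END)
        blocks = size // BS

        written = 0
        start = time.monotonic()
        while time.monotonic() - start < RUNTIME:
            # pick a random block-aligned offset, then write one direct, synchronous block
            os.lseek(fd, random.randrange(blocks) * BS, os.SEEK_SET)
            written += os.write(fd, buf)
        os.close(fd)

        elapsed = time.monotonic() - start
        print("wrote %.0f MB, %.0f KB/s" % (written / 2**20, written / 1024 / elapsed))

    If this script shows the same FreeBSD-versus-Linux gap as fio, the difference is in how each kernel honours O_SYNC on block devices rather than in the fio job itself.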

    Read the article

  • Implement OAuth in Java

    - by phineas
    I made an an attempt to implement OAuth for my programming idea in Java, but I failed miserably. I don't know why, but my code doesn't work. Every time I run my program, an IOException is thrown with the reason "java.io.IOException: Server returned HTTP response code: 401" (401 means Unauthorized). I had a close look at the docs, but I really don't understand why it doesn't work. My OAuth provider I wanted to use is twitter, where I've registered my app, too. Thanks in advance phineas OAuth docs Twitter API wiki Class Base64Coder import java.io.InputStreamReader; import java.io.BufferedReader; import java.io.OutputStream; import java.io.IOException; import java.io.UnsupportedEncodingException; import java.net.URL; import java.net.URLEncoder; import java.net.URLConnection; import java.net.MalformedURLException; import javax.crypto.Mac; import javax.crypto.spec.SecretKeySpec; import java.security.NoSuchAlgorithmException; import java.security.InvalidKeyException; public class Request { public static String read(String url) { StringBuffer buffer = new StringBuffer(); try { /** * get the time - note: value below zero * the millisecond value is used for oauth_nonce later on */ int millis = (int) System.currentTimeMillis() * -1; int time = (int) millis / 1000; /** * Listing of all parameters necessary to retrieve a token * (sorted lexicographically as demanded) */ String[][] data = { {"oauth_callback", "SOME_URL"}, {"oauth_consumer_key", "MY_CONSUMER_KEY"}, {"oauth_nonce", String.valueOf(millis)}, {"oauth_signature", ""}, {"oauth_signature_method", "HMAC-SHA1"}, {"oauth_timestamp", String.valueOf(time)}, {"oauth_version", "1.0"} }; /** * Generation of the signature base string */ String signature_base_string = "POST&"+URLEncoder.encode(url, "UTF-8")+"&"; for(int i = 0; i < data.length; i++) { // ignore the empty oauth_signature field if(i != 3) { signature_base_string += URLEncoder.encode(data[i][0], "UTF-8") + "%3D" + URLEncoder.encode(data[i][1], "UTF-8") + "%26"; } } // cut the last appended %26 signature_base_string = signature_base_string.substring(0, signature_base_string.length()-3); /** * Sign the request */ Mac m = Mac.getInstance("HmacSHA1"); m.init(new SecretKeySpec("CONSUMER_SECRET".getBytes(), "HmacSHA1")); m.update(signature_base_string.getBytes()); byte[] res = m.doFinal(); String sig = String.valueOf(Base64Coder.encode(res)); data[3][1] = sig; /** * Create the header for the request */ String header = "OAuth "; for(String[] item : data) { header += item[0]+"=\""+item[1]+"\", "; } // cut off last appended comma header = header.substring(0, header.length()-2); System.out.println("Signature Base String: "+signature_base_string); System.out.println("Authorization Header: "+header); System.out.println("Signature: "+sig); String charset = "UTF-8"; URLConnection connection = new URL(url).openConnection(); connection.setDoInput(true); connection.setDoOutput(true); connection.setRequestProperty("Accept-Charset", charset); connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded;charset=" + charset); connection.setRequestProperty("Authorization", header); connection.setRequestProperty("User-Agent", "XXXX"); OutputStream output = connection.getOutputStream(); output.write(header.getBytes(charset)); BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream())); String read; while((read = reader.readLine()) != null) { buffer.append(read); } } catch(Exception e) { e.printStackTrace(); } return buffer.toString(); } public static void 
main(String[] args) { System.out.println(Request.read("http://api.twitter.com/oauth/request_token")); } }
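
    For comparison, here is the signing step sketched in Python rather than Java (an illustration, not Twitter's reference code; the consumer key, secret and callback are placeholders). Per the OAuth 1.0a docs, each key and value is percent-encoded, the sorted pairs are joined with literal '=' and '&', that whole parameter string is percent-encoded once more inside the base string, and the HMAC-SHA1 key is the encoded consumer secret followed by '&'.

        import base64, hashlib, hmac, secrets, time
        from urllib.parse import quote

        def enc(s):
            return quote(s, safe="")          # RFC 3986 percent-encoding

        CONSUMER_KEY = "MY_CONSUMER_KEY"      # placeholders
        CONSUMER_SECRET = "MY_CONSUMER_SECRET"
        URL = "https://api.twitter.com/oauth/request_token"

        params = {
            "oauth_callback": "oob",
            "oauth_consumer_key": CONSUMER_KEY,
            "oauth_nonce": secrets.token_hex(16),
            "oauth_signature_method": "HMAC-SHA1",
            "oauth_timestamp": str(int(time.time())),
            "oauth_version": "1.0",
        }

        # key=value pairs, each side percent-encoded, sorted, joined with a real '&'
        param_string = "&".join("%s=%s" % (enc(k), enc(v)) for k, v in sorted(params.items()))
        base_string = "&".join(["POST", enc(URL), enc(param_string)])
        signing_key = enc(CONSUMER_SECRET) + "&"     # no token secret yet for request_token
        digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1).digest()
        params["oauth_signature"] = base64.b64encode(digest).decode()

        header = "OAuth " + ", ".join('%s="%s"' % (enc(k), enc(v)) for k, v in sorted(params.items()))
        print(header)

    Comparing the base string and header this produces against the ones printed by the Java code is a quick way to spot where the two constructions diverge.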

    Read the article

  • how to intercept processing when Session.IsNewSession is true

    - by Cen
    I have a small 4-page application for my customers to work through. They fill out information. If they let it sit too long and the Session times out, I want to pop up a JavaScript alert that their session has expired and that they need to start over; at that point, they are then redirected to the beginning page of the application. I'm getting some strange behavior. I'm stepping through code, forcing my Session.IsNewSession to be true. At this point, I write out a call to JavaScript to a Literal Control placed at the bottom of the . The JavaScript is called, and the redirection occurs. However, what is happening is this: I press a button which is more or less a "Next Page" button, triggering this code. The next page is displayed, and only then do the alert and redirection occur. The result I was expecting was to stay on the same page where I received the timeout, with the alert popping up over it, then the redirection. I'm checking for Session.IsNewSession in a BaseClass for these pages, overriding the OnInit event. Any ideas why I am getting this behavior? Thanks!

    Read the article

  • haproxy access list using path_dir having issues with firefox

    - by user11243
    I'm trying to route all requests containing a path directory of /socket.io/ to a separate port with HAProxy. Here is my config file: global maxconn 4096 # Total Max Connections. This is dependent on ulimit nbproc 2 defaults mode http frontend all 0.0.0.0:80 timeout client 86400000 default_backend web_servers acl is_stream path_dir socket.io use_backend stream_servers if is_stream backend web_servers balance roundrobin option forwardfor # This sets X-Forwarded-For timeout server 30000 timeout connect 4000 server web1 127.0.0.1:4000 weight 1 maxconn 1024 check backend stream_servers balance roundrobin option forwardfor # This sets X-Forwarded-For timeout queue 5000 timeout server 86400000 timeout connect 86400000 server stream1 127.0.0.1:5100 weight 1 maxconn 1024 check URL paths containing /socket.io/ get correctly directed to port 5100 in Chrome and Safari, but not in Firefox. I'm running HAProxy locally on my Mac for dev; not sure if that has anything to do with it. I'm using HAProxy 1.4.8 and Firefox 3.6.15. I've tried clearing the cache in Firefox and it didn't help, so I'm thinking there's something wrong with the way HAProxy parses the Firefox request headers.
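
    One way to narrow this down is to replay the same /socket.io/ request with different User-Agent strings and compare the responses, independent of the browsers. A rough probe, sketched in Python; the frontend address, the handshake path and the use of the Server response header are assumptions about this particular setup.

        import http.client

        FRONTEND = ("127.0.0.1", 80)          # assumed HAProxy frontend
        PATH = "/socket.io/1/"                # assumed socket.io handshake path

        def probe(user_agent):
            conn = http.client.HTTPConnection(*FRONTEND, timeout=5)
            conn.request("GET", PATH, headers={"User-Agent": user_agent})
            resp = conn.getresponse()
            print("%-30s -> HTTP %s, Server: %s" % (user_agent, resp.status, resp.getheader("Server")))
            resp.read()
            conn.close()

        probe("Mozilla/5.0 Chrome/10.0")
        probe("Mozilla/5.0 Firefox/3.6.15")

    If both User-Agents are routed the same way, the difference is more likely in the exact URLs or connection behaviour Firefox uses than in how HAProxy parses its headers.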

    Read the article

  • Redis Cookbook Chat Recipe

    - by Tommy Kennedy
    I am a new starter to Node.Js and Redis. I got the Redis cookbook and was trying out the Chat client & Server recipe. I was wondering if anybody got the code to work or if there is some bug in the code. I dont see where the sent messages from the client get invoked on the server. Any help would be great. Regards, Tom Client Code: <?php ?> <html> <head> <title></title> <script src="http://192.168.0.118:8000/socket.io/socket.io.js"></script> <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.6.1/jquery.min.js"></script> <script> var socket = io.connect('192.168.0.118',{port:8000}); socket.on('message', function(data){ alert(data); //var li = new Element('li').insert(data); //$('messages').insert({top: li}); }); </script> </head> <body> <ul id="messages"> <!-- chat messages go here --> </ul> <form id="chatform" action=""> <input id="chattext" type="text" value="" /> <input type="submit" value="Send" /> </form> <script> $('#chatform').submit(function() { socket.emit('message', 'test'); //$('chattext').val()); $('chattext').val(""); // cleanup the field return false; }); </script> </body> </html> Server Code: var http = require('http'); io = require('socket.io'); redis = require('redis'); rc = redis.createClient(); //rc1 = redis.createClient(); rc.on("connect",function(){ rc.subscribe("chat"); console.log("In Chat Stream"); }); rc.on("message",function (channel,message){ console.log("Sending hope: " + message); //rc1.publish("chat","hope"); socketio.sockets.emit('message',message); }); server = http.createServer(function(req,res){ res.writeHead(200,{'content-type':'text/html'}); res.end('<h1>hello world</h1>'); }); server.listen(8000); var socketio = io.listen(server);
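
    As far as the snippet shows, the server subscribes to the 'chat' channel but never registers a handler that publishes incoming socket.io messages to Redis, which looks like the missing link. The intended round trip is sketched here in Python with redis-py as a stand-in for the Node code; the host and channel mirror the question, and the broadcast step is only a print.

        import redis

        r = redis.Redis(host="192.168.0.118", port=6379)

        def on_client_message(text):
            # What a socket.io connection handler would do with an incoming message:
            # publish it so every subscribed server process sees it.
            r.publish("chat", text)

        def broadcast_loop():
            # What the subscriber side does: receive from the channel and fan the
            # message back out to connected sockets (socketio.sockets.emit in the recipe).
            p = r.pubsub()
            p.subscribe("chat")
            for msg in p.listen():
                if msg["type"] == "message":
                    print("emit to sockets:", msg["data"])

        # broadcast_loop() runs in the server process; on_client_message("test") is what
        # the form's socket.emit('message', 'test') should ultimately trigger.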

    Read the article

  • How can I kill off a Python web app on GAE early following a redirect?

    - by Mike Hayes
    Hi. Disclaimer: I'm completely new to Python, coming from a PHP background. OK, I'm using Python on Google App Engine with Google's webapp framework. I have a function which I import, as it contains things which need to be processed on each page. def some_function(self): if data['user'].new_user and not self.request.path == '/main/new': self.redirect('/main/new') This works fine when I call it, but how can I make sure the app is killed off after the redirection? I don't want anything else processed. For example, I will do this: class Dashboard(webapp.RequestHandler): def get(self): some_function(self) #Continue with normal code here self.response.out.write('Some output here') I want to make sure that once the redirection is made in some_function() (which works fine), no processing is done in the get() function following the redirection, nor is "Some output here" outputted. What should I be looking at to make this all work properly? I can't just exit the script because the webapp framework needs to run. I realise that more than likely I'm just doing things in completely the wrong way for a Python app, so any guidance would be a great help. Hopefully I have explained myself properly and someone will be able to point me in the right direction. Thanks
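
    For illustration, one pattern that fits the code above is to have the shared function report whether it redirected, and return from get() immediately when it did, since self.redirect() only writes the response headers and does not stop execution. A minimal sketch, reusing the names from the question (webapp and data are assumed to come from the surrounding app):

        from google.appengine.ext import webapp   # assumed import for the webapp framework

        def some_function(handler):
            # data is the same structure used in the question
            if data['user'].new_user and handler.request.path != '/main/new':
                handler.redirect('/main/new')
                return True                       # tell the caller a redirect was issued
            return False

        class Dashboard(webapp.RequestHandler):
            def get(self):
                if some_function(self):
                    return                        # nothing below runs after the redirect
                # Continue with normal code here
                self.response.out.write('Some output here')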

    Read the article

  • RSync over SSH hangs and fails with timeout

    - by tx2
    Client: Gentoo, GCC 4.3.4, rsync 3.0.9. Server: Ubuntu 10.04.4 LTS, rsync 3.0.7. Client and server are connected over the Internet, at about 2 Mbps. Ping is OK. rsync called on any files in either direction hangs on a random file, then, after a timeout, fails with: [sender] io timeout after 30 seconds -- exiting rsync error: timeout in data send/receive (code 30) at io.c(140) [sender=3.0.9] [sender] _exit_cleanup(code=30, file=io.c, line=140): about to call exit(30) About 1 in 10 tries passes correctly. I've tried adding the SSH options TcpRcvBufPoll=yes and KeepAlive=yes, and disabling and enabling rsync compression -- no change. How can I make rsync work properly?
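
    Until the underlying drop is found, the transfer can at least be made restartable. A hedged sketch of a retry wrapper in Python: --partial and --timeout are standard rsync options, ServerAliveInterval/ServerAliveCountMax are standard OpenSSH options, and the host and paths are placeholders.

        import subprocess, sys, time

        CMD = [
            "rsync", "-av", "--partial", "--timeout=60",
            "-e", "ssh -o ServerAliveInterval=15 -o ServerAliveCountMax=4",
            "/local/dir/", "user@server:/remote/dir/",
        ]

        for attempt in range(1, 6):
            if subprocess.run(CMD).returncode == 0:
                sys.exit(0)                       # finished cleanly
            print("attempt %d failed, retrying in 10s" % attempt, file=sys.stderr)
            time.sleep(10)
        sys.exit("giving up after 5 attempts")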

    Read the article

  • VPS stops responding every now and again

    - by Or W
    I have a Linode VPS that I use to host some of my websites. It's Ubuntu based and up to date in terms of all packages. I don't have any cron jobs scheduled or any automatic processes. I host a few (up-to-date) WordPress blogs there that have very little traffic altogether. Every day (at a different time) my server stops responding: I can't SSH to it, web access times out, and it just dies until I reboot it through the Linode manager. On the Linode dashboard I can see that the CPU is not very high (2-3%), incoming/outgoing traffic is at 0, and the IO count has a spike just before the server stops responding (swap IO is at 2k and IO rate is at 5k). When I reboot the server, everything is just fine. I'm trying to figure out a way to analyze what's going on at these random times when the server freezes up. How can I determine the problem?
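
    Since the dashboard only shows the spike after the fact, one low-tech approach is to log system state every minute so the minutes leading up to a freeze can be inspected afterwards. A minimal sampler sketched in Python (the log path is a placeholder; it would be run from cron):

        import datetime

        LOG = "/var/log/freeze-watch.log"   # placeholder path

        def snapshot():
            with open("/proc/loadavg") as f:
                load = f.read().strip()
            mem = {}
            with open("/proc/meminfo") as f:
                for line in f:
                    key, _, rest = line.partition(":")
                    mem[key] = rest.strip()
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            return "%s load=[%s] MemFree=%s SwapFree=%s" % (
                stamp, load, mem.get("MemFree"), mem.get("SwapFree"))

        if __name__ == "__main__":
            with open(LOG, "a") as log:
                log.write(snapshot() + "\n")

    If the swap IO spike is the trigger, the last few entries before a freeze would show MemFree and SwapFree collapsing, which would point toward something on the box exhausting memory.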

    Read the article

  • Nightly backups (and maybe other tasks) causing server alerts

    - by J. Pablo Fernández
    I have two independent alert notification systems for my servers. The server is a virtual machine on Linode, and one of the alerts comes from Linode; the other monitoring system we use is New Relic. They are both watching IO utilization. Every night I get alerts from both of them because the server is using too much IO. I run quite a few tasks in the middle of the night, but the one I've confirmed can cause IO warnings is running the backups. The backup is done by s3cmd sync. I tried ionice, but it still generates the warnings. Getting warnings every night reduces the efficacy of warnings when they happen for real. For Linode I could raise the level at which a warning is issued, but that might make the whole thing useless if the level ends up too high. What would be the proper solution for this?
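
    ionice changes the priority of the IO, not the total amount of IO the alerts are counting, so another direction is to spread the backup's reads over time instead of letting them burst. The principle, sketched in Python as a throttled copy loop; the 8 MB/s limit and the file names are placeholders, and in practice the throttling would have to apply to whatever feeds s3cmd.

        import time

        def throttled_copy(src, dst, limit=8 * 1024 * 1024, chunk=1024 * 1024):
            """Copy src to dst, sleeping whenever we get ahead of `limit` bytes/sec."""
            start = time.monotonic()
            copied = 0
            with open(src, "rb") as fin, open(dst, "wb") as fout:
                while True:
                    data = fin.read(chunk)
                    if not data:
                        break
                    fout.write(data)
                    copied += len(data)
                    ahead = copied / limit - (time.monotonic() - start)
                    if ahead > 0:
                        time.sleep(ahead)

        throttled_copy("/var/backups/site.tar.gz", "/tmp/staging/site.tar.gz")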

    Read the article

  • Fixing up Configurations in BizTalk Solution Files

    - by Elton Stoneman
    Just a quick one this, but useful for mature BizTalk solutions, where over time the configuration settings can get confused, meaning Debug configurations building in Release mode, or Deployment configurations building in Development mode. That can cause issues in the build which aren't obvious, so it's good to fix up the configurations. It's time-consuming in VS or in a text editor, so this bit of PowerShell may come in useful - just substitute your own solution path in the $path variable: $path = 'C:\x\y\z\x.y.z.Integration.sln' $backupPath = [System.String]::Format('{0}.bak', $path) [System.IO.File]::Copy($path, $backupPath, $True) $sln = [System.IO.File]::ReadAllText($path)   $sln = $sln.Replace('.Debug|.NET.Build.0 = Deployment|.NET', '.Debug|.NET.Build.0 = Development|.NET') $sln = $sln.Replace('.Debug|.NET.Deploy.0 = Deployment|.NET', '.Debug|.NET.Deploy.0 = Development|.NET') $sln = $sln.Replace('.Debug|Any CPU.ActiveCfg = Deployment|.NET', '.Debug|Any CPU.ActiveCfg = Development|.NET') $sln = $sln.Replace('.Deployment|.NET.ActiveCfg = Debug|Any CPU', '.Deployment|.NET.ActiveCfg = Release|Any CPU') $sln = $sln.Replace('.Deployment|Any CPU.ActiveCfg = Debug|Any CPU', '.Deployment|Any CPU.ActiveCfg = Release|Any CPU') $sln = $sln.Replace('.Deployment|Any CPU.Build.0 = Debug|Any CPU', '.Deployment|Any CPU.Build.0 = Release|Any CPU') $sln = $sln.Replace('.Deployment|Mixed Platforms.ActiveCfg = Debug|Any CPU', '.Deployment|Mixed Platforms.ActiveCfg = Release|Any CPU') $sln = $sln.Replace('.Deployment|Mixed Platforms.Build.0 = Debug|Any CPU', '.Deployment|Mixed Platforms.Build.0 = Release|Any CPU') $sln = $sln.Replace('.Deployment|.NET.ActiveCfg = Debug|Any CPU', '.Deployment|.NET.ActiveCfg = Release|Any CPU') $sln = $sln.Replace('.Debug|.NET.ActiveCfg = Deployment|.NET', '.Debug|.NET.ActiveCfg = Development|.NET')   [System.IO.File]::WriteAllText($path, $sln) The script creates a backup of the solution file first, and then fixes up all the configs to use the correct builds. It's a simple search and replace list, so if there are any patterns that need to be added let me know and I'll update the script. A RegEx replace would be neater, but when it comes to hacking solution files, I prefer the conservative approach of knowing exactly what you're changing.
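
    For reference, the regex approach the post mentions could look something like this sketch in Python, assuming the same two mappings (Debug configurations rewritten from Deployment|.NET to Development|.NET, and Deployment configurations rewritten from Debug|Any CPU to Release|Any CPU) and keeping the backup-first habit of the original script.

        import re, shutil

        path = r"C:\x\y\z\x.y.z.Integration.sln"    # placeholder path, as in the post
        shutil.copyfile(path, path + ".bak")        # back up the solution first

        with open(path, encoding="utf-8") as f:
            sln = f.read()

        # .Debug|<anything> = Deployment|.NET  ->  ... = Development|.NET
        sln = re.sub(r"(\.Debug\|[^=\r\n]*= )Deployment\|\.NET", r"\1Development|.NET", sln)
        # .Deployment|<anything> = Debug|Any CPU  ->  ... = Release|Any CPU
        sln = re.sub(r"(\.Deployment\|[^=\r\n]*= )Debug\|Any CPU", r"\1Release|Any CPU", sln)

        with open(path, "w", encoding="utf-8") as f:
            f.write(sln)

    As the post says, the literal search-and-replace list is the more conservative choice; the regex version trades that explicitness for not having to enumerate every platform variant.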

    Read the article

  • Perl - can't flush STDOUT or STDERR

    - by Jim Salter
    Perl 5.14 from stock Ubuntu Precise repos. Trying to write a simple wrapper to monitor progress on copying from one stream to another: use IO::Handle; while ($bufsize = read (SOURCE, $buffer, 1048576)) { STDERR->printflush ("Transferred $xferred of $sendsize bytes\n"); $xferred += $bufsize; print TARGET $buffer; } This does not perform as expected (writing a line each time the 1M buffer is read). I end up seeing the first line (with a blank value of $xferred), and then the 7th and 8th lines (on an 8MB transfer). I've been pounding my brains out on this for hours - I've read the perldocs, I've read the classic "Suffering from Buffering" article, I've tried everything from select and $|++ to IO::Handle to binmode (STDERR, ":unix") to you name it. I've also tried flushing TARGET with each line using IO::Handle (TARGET->flush). No dice. Has anybody else ever encountered this? I don't have any ideas left. Sleeping one second "fixes" the problem, but obviously I don't want to sleep a second every time I read a buffer just so my progress will output on the screen! FWIW, the problem is exactly the same whether I'm outputting to STDERR or STDOUT.

    Read the article

  • Updated Intel display driver causing errors when booting

    - by cdysthe
    I upgraded to Graphics Installer 1.0.6 on my Ubuntu 14.04 and installed the drivers using the Intel Graphics Installer. The laptop is Intel Ivy Bridge powered with Intel HD graphics. It's Optimus, but I have disabled the Nvidia card in the BIOS. The Intel Graphics Installer installs the package i915-3.15-3.13-dkms.deb, which I assume is the updated driver. It causes a bunch of error messages when I boot. Here are the relevant errors from dmesg: [ 7.206151] drm: module verification failed: signature and/or required key missing - tainting kernel [ 7.208045] drm: module has bad taint, not creating trace events [ 7.336470] fb: conflicting fb hw usage inteldrmfb vs VESA VGA - removing generic driver [ 7.393854] [drm] Supports vblank timestamp caching Rev 2 (21.10.2013) [ 7.393855] [drm] Driver supports precise vblank timestamp query. [ 7.393921] vgaarb: device changed decodes: PCI:0000:00:02.0,olddecodes=io+mem,decodes=io+mem:owns=io+mem [ 7.505798] [drm] GMBUS [i915 gmbus dpb] timed out, falling back to bit banging on pin 5 [ 7.507233] init: Failed to obtain startpar-bridge instance: Unknown parameter: INSTANCE [ 7.944183] [drm:cpt_serr_int_handler] ERROR PCH transcoder A FIFO underrun [ 8.368479] i915 0000:00:02.0: fb0: inteldrmfb frame buffer device [ 8.368480] i915 0000:00:02.0: registered panic notifier [ 8.818416] [drm] Enabling RC6 states: RC6 on, RC6p on, RC6pp off What could the problem be, and will it affect performance? I tried removing the package and the errors went away, but then I'm running an older driver, I assume?

    Read the article

  • Why am I not getting an sRGB default framebuffer?

    - by Aaron Rotenberg
    I'm trying to make my OpenGL Haskell program gamma correct by making appropriate use of sRGB framebuffers and textures, but I'm running into issues making the default framebuffer sRGB. Consider the following Haskell program, compiled for 32-bit Windows using GHC and linked against 32-bit freeglut: import Foreign.Marshal.Alloc(alloca) import Foreign.Ptr(Ptr) import Foreign.Storable(Storable, peek) import Graphics.Rendering.OpenGL.Raw import qualified Graphics.UI.GLUT as GLUT import Graphics.UI.GLUT(($=)) main :: IO () main = do (_progName, _args) <- GLUT.getArgsAndInitialize GLUT.initialDisplayMode $= [GLUT.SRGBMode] _window <- GLUT.createWindow "sRGB Test" -- To prove that I actually have freeglut working correctly. -- This will fail at runtime under classic GLUT. GLUT.closeCallback $= Just (return ()) glEnable gl_FRAMEBUFFER_SRGB colorEncoding <- allocaOut $ glGetFramebufferAttachmentParameteriv gl_FRAMEBUFFER gl_FRONT_LEFT gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING print colorEncoding allocaOut :: Storable a => (Ptr a -> IO b) -> IO a allocaOut f = alloca $ \ptr -> do f ptr peek ptr On my desktop (Windows 8 64-bit with a GeForce GTX 760 graphics card) this program outputs 9729, a.k.a. gl_LINEAR, indicating that the default framebuffer is using linear color space, even though I explicitly requested an sRGB window. This is reflected in the rendering results of the actual program I'm trying to write - everything looks washed out because my linear color values aren't being converted to sRGB before being written to the framebuffer. On the other hand, on my laptop (Windows 7 64-bit with an Intel graphics chip), the program prints 0 (huh?) and I get an sRGB default framebuffer by default whether I request one or not! And on both machines, if I manually create a non-default framebuffer bound to an sRGB texture, the program correctly prints 35904, a.k.a. gl_SRGB. Why am I getting different results on different hardware? Am I doing something wrong? How can I get an sRGB framebuffer consistently on all hardware and target OSes?

    Read the article
