Search Results

Search found 7500 results on 300 pages for 'const char'.

Page 189/300 | < Previous Page | 185 186 187 188 189 190 191 192 193 194 195 196  | Next Page >

  • Error 0x8007007e When trying to mount WinPE 5 wim in Windows 7 using Powershell

    - by BigHomie
    Using the ADK for Windows 8.1, and the DISM cmdlets that come with it. I have WMF 4.0 installed. My machine is Windows 7 x64 SP1, and I'm trying to mount the wim using PS C:\Users\BigHomie> Mount-WindowsImage -ImagePath 'C:\Program Files (x86)\Windows Kits\8.1\Assessment and Deployment Kit\Windows Preinstallation Environment\x86\en-us\winpe.wim' -Path C:\WinPE_x86 -index 1 and receive the following error: Mount-WindowsImage : DismInitialize failed. Error code = 0x8007007e At line:1 char:1 + Mount-WindowsImage -ImagePath 'C:\Program Files (x86)\Windows Kits\8.1\Assessmen ... + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + CategoryInfo : NotSpecified: (:) [Mount-WindowsImage], COMException + FullyQualifiedErrorId : Microsoft.Dism.Commands.MountWindowsImageCommand Using dism.exe works fine. Update: Forgetting I had this problem, I went to mount a wim using the PowerShell ISE and actually got a visual error message about "C:\Program Files (x86)\Windows Kits\8.1\Assessment and Deployment Kit\Deployment Tools\x86\DISM\api-ms-win-downlevel-advapi32-l4-1-0.dll" not being installed. After checking that the DLL did in fact exist in the folder, I called regsvr32 and received another error message. I will try reinstalling as recommended.
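
    For reference, the dism.exe route that the post says works would look roughly like the following single command (a sketch only; the WIM path is the one from the post and option casing may vary):

        dism /Mount-Wim /WimFile:"C:\Program Files (x86)\Windows Kits\8.1\Assessment and Deployment Kit\Windows Preinstallation Environment\x86\en-us\winpe.wim" /Index:1 /MountDir:C:\WinPE_x86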

    Read the article

  • Can I use a multi-line function or control flow segment in the PowerShellFar editor console

    - by Justin Dearing
    If I am running Far Manager with FarNet and PowerShellFar, I can bring up a console of sorts by selecting F11 | .NET PowerShell | Editor Console. This console is based on the Far editor. I can paste snippets of PowerShell scripts into this console and edit them. The only problem is if I want to use a multi-line function or control flow segment in the console. If I paste it in, it has no effect. If I attempt to type one in, I get an error similar to: ERROR: IncompleteParseException: Missing closing '}' in statement block. At line:1 char:42 + foreach ($number in 1..10 ) { $number * 7 <<<< + CategoryInfo : ParserError: (CloseBraceToken:TokenId) [], ParentContainsErrorRecordException + FullyQualifiedErrorId : MissingEndCurlyBrace Is this simply a limitation of PowerShellFar?

    Read the article

  • EXC_BAD_ACCESS in CFAttributedStringSetAttribute and NSNumber?

    - by RichardR
    Hi all, I am getting an infuriating EXC_BAD_ACCESS error in an objective c app I am working on. Any help you could offer would be much appreciated. I have tried the normal debug methods for this error (turning on NSZombieEnabled, checking retain/release/autorelease to make sure I'm not trying to access a deallocated object, etc.) and it hasn't seemed to help. Basically, the error always occurs in this function: ` void op_TJ(CGPDFScannerRef scanner, void *info) { PDFPage *self = info; CGPDFArrayRef array; NSMutableString *tempString = [NSMutableString stringWithCapacity:1]; NSMutableArray *kernArray = [[NSMutableArray alloc] initWithCapacity:1]; if(!CGPDFScannerPopArray(scanner, &array)) { [kernArray release]; return; } for(size_t n = 0; n < CGPDFArrayGetCount(array); n += 2) { if(n >= CGPDFArrayGetCount(array)) continue; CGPDFStringRef pdfString; // if we get a PDF string if (CGPDFArrayGetString(array, n, &pdfString)) { //get the actual string const unsigned char *charstring = CGPDFStringGetBytePtr(pdfString); //add this string to our temp string [tempString appendString:[NSString stringWithCString:(const char*)charstring encoding:[self pageEncoding]]]; //NSLog(@"string: %@", tempString); //get the space after this string CGPDFReal r = 0; if (n+1 < CGPDFArrayGetCount(array)) { CGPDFArrayGetNumber(array, n+1, &r); // multiply by the font size CGFloat k = r; k = -k/1000 * self.tmatrix.a * self.fontSize; CGFloat kKern = self.kern * self.tmatrix.a; k = k + kKern; // add the location and kern to the array NSNumber *tempKern = [NSNumber numberWithFloat:k]; NSLog(@"tempKern address: %p", tempKern); [kernArray addObject:[NSArray arrayWithObjects:[NSNumber numberWithInt:[tempString length] - 1], tempKern, nil]]; } } } // create an attribute string CFMutableAttributedStringRef attString = CFAttributedStringCreateMutable(kCFAllocatorDefault, 10); CFAttributedStringReplaceString(attString, CFRangeMake(0, 0), (CFStringRef)tempString); //apply overall kerning NSNumber *tkern = [NSNumber numberWithFloat:self.kern * self.tmatrix.a * self.fontSize]; CFAttributedStringSetAttribute(attString, CFRangeMake(0, CFAttributedStringGetLength(attString)), kCTKernAttributeName, (CFNumberRef)tkern); //apply individual kern attributes for (NSArray *kernLoc in kernArray) { NSLog(@"kern location: %i, %i", [[kernLoc objectAtIndex:0] intValue],[[kernLoc objectAtIndex:1] floatValue]); CFAttributedStringSetAttribute(attString, CFRangeMake([[kernLoc objectAtIndex:0] intValue], 1), kCTKernAttributeName, (CFNumberRef)[kernLoc objectAtIndex:1]); } CFAttributedStringReplaceAttributedString([self cfAttString], CFRangeMake(CFAttributedStringGetLength([self cfAttString]), 0), attString); //release CFRelease(attString); [kernArray release]; } ` The program always crashes because of line CFAttributedStringSetAttribute(attString, CFRangeMake([[kernLoc objectAtIndex:0] intValue], 1), kCTKernAttributeName, (CFNumberRef)[kernLoc objectAtIndex:1]) And it seems to depend on a few things: if [kernLoc objectAtIndex:1] refers to an [NSNumber numberWithFloat:k] where k = 0 (in other words, if k = 0 above where I populate kernArray) then the program crashes almost immediately If I comment out the line k = k + kKern, it takes longer for the program to crash, but does eventually (why would the crash depend on this value?) If I change the length of CFRangeMake from 1 to 0, it takes a lot longer for the program to crash, but still eventually does. (I don't think I am trying to access beyond the bounds of attString, but am I missing something?) 
When it crashes, I get something similar to: #0 0x942c7ed7 in objc_msgSend () #1 0x00000013 in ?? () #2 0x0285b827 in CFAttributedStringSetAttribute () #3 0x0000568f in op_TJ (scanner=0x472a590, info=0x4a32320) at /Users/Richard/Desktop/AppTest/PDFHighlight 2/PDFScannerOperators.m:251 Any ideas? It seems like somewhere along the way I am overwriting memory or trying to access memory that has been changed, but I have no idea. If there's anymore information I can provide, please let me know. Thanks, Richard

    Read the article

  • Problem getting System.Diagnostics.Process RedirectStandardOutput to appear in a Winforms Textbox in real-time

    - by Jonathan Websdale
    I'm having problems with the redirected output from a console application being presented in a Winforms textbox in real-time. The messages are being produced line by line however as soon as interaction with the Form is called for, nothing appears to be displayed. Following the many examples on both Stackoverflow and other forums, I can't seem to get the redirected output from the process to display in the textbox on the form until the process completes. By adding debug lines to the 'stringWriter_StringWritten' method and writing the redirected messages to the debug window I can see the messages arriving during the running of the process but these messages will not appear on the form's textbox until the process completes. Grateful for any advice on this. Here's an extract of the code public partial class RunExternalProcess : Form { private static int numberOutputLines = 0; private static MyStringWriter stringWriter; public RunExternalProcess() { InitializeComponent(); // Create the output message writter RunExternalProcess.stringWriter = new MyStringWriter(); stringWriter.StringWritten += new EventHandler(stringWriter_StringWritten); System.Diagnostics.ProcessStartInfo startInfo = new System.Diagnostics.ProcessStartInfo(); startInfo.FileName = "myCommandLineApp.exe"; startInfo.UseShellExecute = false; startInfo.RedirectStandardOutput = true; startInfo.CreateNoWindow = true; startInfo.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden; using (var pProcess = new System.Diagnostics.Process()) { pProcess.StartInfo = startInfo; pProcess.OutputDataReceived += new System.Diagnostics.DataReceivedEventHandler(RunExternalProcess.Process_OutputDataReceived); pProcess.EnableRaisingEvents = true; try { pProcess.Start(); pProcess.BeginOutputReadLine(); pProcess.BeginErrorReadLine(); pProcess.WaitForExit(); } catch (Exception ex) { MessageBox.Show(ex.ToString()); } finally { pProcess.OutputDataReceived -= new System.Diagnostics.DataReceivedEventHandler(RunExternalProcess.Process_OutputDataReceived); } } } private static void Process_OutputDataReceived(object sender, System.Diagnostics.DataReceivedEventArgs e) { if (!string.IsNullOrEmpty(e.Data)) { RunExternalProcess.OutputMessage(e.Data); } } private static void OutputMessage(string message) { RunExternalProcess.stringWriter.WriteLine("[" + RunExternalProcess.numberOutputLines++.ToString() + "] - " + message); } private void stringWriter_StringWritten(object sender, EventArgs e) { System.Diagnostics.Debug.WriteLine(((MyStringWriter)sender).GetStringBuilder().ToString()); SetProgressTextBox(((MyStringWriter)sender).GetStringBuilder().ToString()); } private delegate void SetProgressTextBoxCallback(string text); private void SetProgressTextBox(string text) { if (this.ProgressTextBox.InvokeRequired) { SetProgressTextBoxCallback callback = new SetProgressTextBoxCallback(SetProgressTextBox); this.BeginInvoke(callback, new object[] { text }); } else { this.ProgressTextBox.Text = text; this.ProgressTextBox.Select(this.ProgressTextBox.Text.Length, 0); this.ProgressTextBox.ScrollToCaret(); } } } public class MyStringWriter : System.IO.StringWriter { // Define the event. 
public event EventHandler StringWritten; public MyStringWriter() : base() { } public MyStringWriter(StringBuilder sb) : base(sb) { } public MyStringWriter(StringBuilder sb, IFormatProvider formatProvider) : base(sb, formatProvider) { } public MyStringWriter(IFormatProvider formatProvider) : base(formatProvider) { } protected virtual void OnStringWritten() { if (StringWritten != null) { StringWritten(this, EventArgs.Empty); } } public override void Write(char value) { base.Write(value); this.OnStringWritten(); } public override void Write(char[] buffer, int index, int count) { base.Write(buffer, index, count); this.OnStringWritten(); } public override void Write(string value) { base.Write(value); this.OnStringWritten(); } }

    Read the article

  • gVIM "put" driving me mad, how do I "put" at the beginning of a line

    - by crgnz
    I'm learning gVIM on Windows, and as I slowly learn more of the keystrokes I find myself using the mouse less and less, which is great. I have a couple of questions I've yet to figure out: I do a lot of copy and paste. So I use 'v' to enter VISUAL mode, use k/j to move up/down and select the lines, then hit 'y' to yank. I then go to the line where I want to insert, and hit 'p' to put, BUT the darn thing pastes after the 1st character. I can't move any further left, so I am definitely at the start of the line, so I find the 'p'ut behaviour of pasting 1 char after my cursor position to be supremely annoying. I switch between edit and command mode an awful lot, and my poor little finger on my left hand is getting sore from being stretched out to hit the 'Esc' key (to enter command mode) every few seconds. Is there a more finger-friendly way to enter command mode?
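
    For what it's worth, a couple of stock Vim behaviours cover both points; the lines below are a sketch (the jk mapping is only an illustrative ~/.vimrc addition, not something Vim ships with):

        V                      " linewise visual select: p/P will then put whole lines below/above
        P                      " put before the cursor (above the current line for linewise yanks)
        0P                     " for a characterwise yank: jump to column 1, then put before the cursor
        CTRL-[                 " built-in equivalent of Esc
        :inoremap jk <Esc>     " optional mapping to leave insert mode without stretching for Esc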

    Read the article

  • home, end, delete, pageup, pagedown with ksh

    - by Nicolas
    Hello. I want to use the Home, End, Delete, PageUp and PageDown keys with ksh. My TERM is xterm-color. These keys work fine with tcsh and zsh, but not with ksh (they print a tilde ~). I found this: bind '^[[3'=prefix-2 bind '^[[3~'=delete-char-forward bind '^[[1'=prefix-2 bind '^[[1~'=beginning-of-line bind '^[[4'=prefix-2 bind '^[[4~'=end-of-line But when I set one bindkey, the previous one no longer works. How can I use these keys in ksh with a .kshrc? Thanks.

    Read the article

  • How can a UDP packet be 65535 bytes when Ethernet does not allow frames of more than 1500 bytes

    - by nikku
    I am using Fast Ethernet at 100 Mbps, whose frame size is less than 1500 bytes (1472 bytes for the payload, as per my textbook). Over that, I was able to send and receive a UDP packet with a message size of 65507 bytes, which means the packet size was 65507 + 20 (IP header) + 8 (UDP header) = 65535. If the frame's payload is at most 1472 bytes (as per my textbook), how can the IP packet size (here 65535) be greater than that? I used sender code like char buffer[100000]; for (int i = 1; i < 100000; i++) { int len = send(socket_id, buffer, i, 0); printf("%d\n", len); } and receiver code like while (len = recv(socket_id, buffer, 100000, 0)) { printf("%d\n", len); } I observed that send returns -1 for i > 65507 and that recv prints/receives a packet of at most 65507 bytes.
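
    The reconciliation happens below UDP: IP fragments the datagram on the sending side and reassembles it at the receiver, so neither UDP nor the application ever sees the 1500-byte frame limit. A rough worked example (assuming a 1500-byte MTU and a 20-byte IP header with no options): each frame carries 1500 - 20 = 1480 bytes of IP payload, the 65535-byte datagram has 65535 - 20 = 65515 bytes to carry, so it travels as ceil(65515 / 1480) = 45 fragments, which the receiving IP layer reassembles before handing the single 65507-byte message to the socket.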

    Read the article

  • Active Desktop cannot be restored on Win XP

    - by Phil.Wheeler
    This is more of a major annoyance than anything that's stopping me from doing my work, but I somehow seem to have had Active Desktop on my work XP machine get corrupted and now can't get it back working again. I've tried browsing to C:\Documents and Settings\%my-user-name%\Application Data\Microsoft\Internet Explorer and changing, deleting or replacing the Desktop.htt file, but it's not achieving anything. The error I was previously getting when trying to click the "Restore my active desktop" button was: Internet Explorer Script Error An error has occurred in the script on this page. Line: 65 Char: 1 Error: Object doesn't support this action Code: 0 URL: file://C:/Documents%20and%20Settings//Application %20Data/Microsoft/Internet%20Explorer/Desktop.htt Any ideas?

    Read the article

  • How to read the registry correctly for multiple values in C?

    - by kampi
    Hi! I created a .dll which should work like the RunAs command. The only difference is, that it should read from registry. My problem is, that i need to reed 3 values from the registry, but i can't. It reads the first, than it fails at the second one (Password) with error code 2, which means "The system cannot find the file specified". If i query only for domain and username then it is ok, if i query only for password then it it still succeeds, but if i want to query all three then it fails. Can someone tell me, what i am doing wrong? Heres my code: HKEY hKey = 0; DWORD dwType = REG_SZ; DWORD dwBufSize = sizeof(buf); TCHAR szMsg [MAX_PATH + 32]; HANDLE handle; LPVOID lpMsgBuf; if( RegOpenKeyEx( HKEY_CURRENT_USER, TEXT("SOFTWARE\\Kampi Corporation\\RunAs!"), 0, KEY_QUERY_VALUE, &hKey ) == ERROR_SUCCESS ) { if( RegQueryValueEx( hKey, TEXT("Username"), 0, &dwType, (LPBYTE)buf, &dwBufSize ) == ERROR_SUCCESS ) { memset( szMsg, 0, sizeof( szMsg ) ); wsprintf ( szMsg, _T("%s"), buf ); mbstowcs( wuser, szMsg, 255 ); RegCloseKey( hKey ); } else { MessageBox ( pCmdInfo->hwnd, "Can not query for Username key value!", _T("RunAs!"), MB_ICONERROR ); RegCloseKey( hKey ); return -1; } } else { CSimpleShlExt::showerror( GetLastError(), pCmdInfo->hwnd, "RegOpenKeyEx failed for Username with error code :: " ); return -1; } if( RegOpenKeyEx( HKEY_CURRENT_USER, TEXT("SOFTWARE\\Kampi Corporation\\RunAs!"), 0, KEY_QUERY_VALUE ,&hKey ) == ERROR_SUCCESS ) { if( RegQueryValueEx( hKey, TEXT("Password"), 0, &dwType, (LPBYTE)buf, &dwBufSize ) == ERROR_SUCCESS ) { memset( szMsg, 0, sizeof( szMsg ) ); wsprintf ( szMsg, _T("%s"), buf ); mbstowcs( wpass, szMsg, 255 ); RegCloseKey( hKey ); } else { char test[200]; sprintf(test,"Can not query for Password key value! EC: %d",GetLastError() ); MessageBox ( pCmdInfo->hwnd, test, _T("RunAs!"), MB_ICONERROR ); RegCloseKey( hKey ); return -1; } } else { CSimpleShlExt::showerror( GetLastError(), pCmdInfo->hwnd, "RegOpenKeyEx failed for Password with error code :: " ); return -1; } if( RegOpenKeyEx( HKEY_CURRENT_USER, TEXT("SOFTWARE\\Kampi Corporation\\RunAs!"), 0, KEY_QUERY_VALUE ,&hKey ) == ERROR_SUCCESS ) { if( RegQueryValueEx( hKey, TEXT("Domain"), 0, &dwType, (LPBYTE)buf, &dwBufSize ) == ERROR_SUCCESS ) { memset( szMsg, 0, sizeof( szMsg ) ); wsprintf ( szMsg, _T("%s"), buf ); mbstowcs( wdomain, szMsg, 255 ); RegCloseKey( hKey ); } else { char test[200]; sprintf(test,"Can not query for Password key value! EC: %d",GetLastError() ); MessageBox ( pCmdInfo->hwnd, test, _T("RunAs!"), MB_ICONERROR ); RegCloseKey( hKey ); return -1; } } else { CSimpleShlExt::showerror( GetLastError(), pCmdInfo->hwnd, "RegOpenKeyEx failed for Domain with error code :: " ); return -1; }
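
    Two details worth checking in the code above: the last argument of RegQueryValueEx is in/out (the call overwrites it with the number of bytes actually returned), and the registry APIs report failures through their return value rather than through GetLastError(). A minimal C++/Win32 sketch of a string-reading helper along those lines follows (the helper name and the REG_SZ-only check are my own additions, not from the post):

        #include <windows.h>

        // Sketch: read one REG_SZ value. 'size' starts fresh for every query because
        // RegQueryValueEx overwrites it with the number of bytes it returned last time.
        // Registry functions return the error code directly, so that is what we report.
        LONG ReadRegString(HKEY root, LPCTSTR subKey, LPCTSTR valueName,
                           TCHAR* buffer, DWORD bufferBytes)
        {
            HKEY hKey = 0;
            LONG rc = RegOpenKeyEx(root, subKey, 0, KEY_QUERY_VALUE, &hKey);
            if (rc != ERROR_SUCCESS)
                return rc;

            DWORD type = 0;
            DWORD size = bufferBytes;                    // fresh in/out size for this call
            rc = RegQueryValueEx(hKey, valueName, 0, &type,
                                 reinterpret_cast<LPBYTE>(buffer), &size);
            RegCloseKey(hKey);

            if (rc == ERROR_SUCCESS && type != REG_SZ)
                rc = ERROR_INVALID_DATATYPE;             // value exists but is not a string
            return rc;
        }

        // Usage sketch: one call per value, checking each return code, e.g.
        // ReadRegString(HKEY_CURRENT_USER,
        //               TEXT("SOFTWARE\\Kampi Corporation\\RunAs!"),
        //               TEXT("Username"), buf, sizeof(buf));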

    Read the article

  • AT91SAM7X512's SPI peripheral gets disabled on write to SPI_TDR

    - by Dor
    My AT91SAM7X512's SPI peripheral gets disabled on the X time (X varies) that I write to SPI_TDR. As a result, the processor hangs on the while loop that checks the TDRE flag in SPI_SR. This while loop is located in the function SPI_Write() that belongs to the software package/library provided by ATMEL. The problem occurs arbitrarily - sometimes everything works OK and sometimes it fails on repeated attempts (attemp = downloading the same binary to the MCU and running the program). Configurations are (defined in the order of writing): SPI_MR: MSTR = 1 PS = 0 PCSDEC = 0 PCS = 0111 DLYBCS = 0 SPI_CSR[3]: CPOL = 0 NCPHA = 1 CSAAT = 0 BITS = 0000 SCBR = 20 DLYBS = 0 DLYBCT = 0 SPI_CR: SPIEN = 1 After setting the configurations, the code verifies that the SPI is enabled, by checking the SPIENS flag. I perform a transmission of bytes as follows: const short int dataSize = 5; // Filling array with random data unsigned char data[dataSize] = {0xA5, 0x34, 0x12, 0x00, 0xFF}; short int i = 0; volatile unsigned short dummyRead; SetCS3(); // NPCS3 == PIOA15 while(i-- < dataSize) { mySPI_Write(data[i]); while((AT91C_BASE_SPI0->SPI_SR & AT91C_SPI_TXEMPTY) == 0); dummyRead = SPI_Read(); // SPI_Read() from Atmel's library } ClearCS3(); /**********************************/ void mySPI_Write(unsigned char data) { while ((AT91C_BASE_SPI0->SPI_SR & AT91C_SPI_TXEMPTY) == 0); AT91C_BASE_SPI0->SPI_TDR = data; while ((AT91C_BASE_SPI0->SPI_SR & AT91C_SPI_TDRE) == 0); // <-- This is where // the processor hangs, because that the SPI peripheral is disabled // (SPIENS equals 0), which makes TDRE equal to 0 forever. } Questions: What's causing the SPI peripheral to become disabled on the write to SPI_TDR? Should I un-comment the line in SPI_Write() that reads the SPI_RDR register? Means, the 4th line in the following code: (The 4th line is originally marked as a comment) void SPI_Write(AT91S_SPI *spi, unsigned int npcs, unsigned short data) { // Discard contents of RDR register //volatile unsigned int discard = spi->SPI_RDR; /* Send data */ while ((spi->SPI_SR & AT91C_SPI_TXEMPTY) == 0); spi->SPI_TDR = data | SPI_PCS(npcs); while ((spi->SPI_SR & AT91C_SPI_TDRE) == 0); } Is there something wrong with the code above that transmits 5 bytes of data? Please note: The NPCS line num. 3 is a GPIO line (means, in PIO mode), and is not controlled by the SPI controller. I'm controlling this line by myself in the code, by de/asserting the ChipSelect#3 (NPCS3) pin when needed. The reason that I'm doing so is because that problems occurred while trying to let the SPI controller to control this pin. I didn't reset the SPI peripheral twice, because that the errata tells to reset it twice only if I perform a reset - which I don't do. Quoting the errata: If a software reset (SWRST in the SPI Control Register) is performed, the SPI may not work properly (the clock is enabled before the chip select.) Problem Fix/Workaround The SPI Control Register field, SWRST (Software Reset) needs to be written twice to be cor- rectly set. I noticed that sometimes, if I put a delay before the write to the SPI_TDR register (in SPI_Write()), then the code works perfectly and the communications succeeds. Useful links: AT91SAM7X Series Preliminary.pdf ATMEL software package/library spi.c from Atmel's library spi.h from Atmel's library An example of initializing the SPI and performing a transfer of 5 bytes is highly appreciated and helpful.

    Read the article

  • [Java] Cannot draw pixels

    - by Wilhelm
    Hello everyone. I want to print each digit of pi number as a colored pixel, so, I get na input, with the pi number, then parse it into a list, each node containing a digit (I know, I'll use an array later), but I never get this painted to screen... Can someone help me to see where I'm wrong? package edu.pi.view; import java.awt.Graphics; import java.awt.Image; import java.awt.image.MemoryImageSource; import java.io.BufferedReader; import java.io.File; import java.io.FileReader; import java.util.ArrayList; import java.util.List; import javax.swing.JFrame; import javax.swing.JPanel; public class Main extends JPanel { private static final long serialVersionUID = 6416932054834995251L; private static int pixels[]; private static List<Integer> pi; public static void readFile(String name) { File file = new File(name); BufferedReader reader = null; pi = new ArrayList<Integer>(); char[] digits; try { reader = new BufferedReader(new FileReader(file)); String text = null; while((text = reader.readLine()) != null) { digits = text.toCharArray(); for(char el : digits) if(el != ' ') pi.add(Character.getNumericValue(el)); } } catch (Exception e) { e.printStackTrace(); } } public void paint(Graphics gg) { readFile("c:\\pi.txt"); int h = 5; int w = 2; int color = 0xffffff; int digit; int i = 0; pixels = new int[w * h]; for (int y = 0; y < h; y++) { for (int x = 0; x < h; x++) { digit = pi.get(i); if(digit == 0) color = 0x000000; else if(digit == 1) color = 0x787878; else if(digit == 2) color = 0x008B00; else if(digit == 3) color = 0x00008B; else if(digit == 4) color = 0x008B8B; else if(digit == 5) color = 0x008B00; else if(digit == 6) color = 0xCDCD00; else if(digit == 7) color = 0xFF4500; else if(digit == 8) color = 0x8B0000; else if(digit == 9) color = 0xFF0000; pixels[i] = color; i++; } } Image art = createImage(new MemoryImageSource(w, h, pixels, 0, w)); gg.drawImage(art, 0, 0, this); } public static void main(String[] args) { JFrame frame = new JFrame(); frame.getContentPane().add(new Main()); frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); frame.setSize(300,300); frame.setVisible(true); } }

    Read the article

  • custom type face class by dinesh?

    - by dineshpeiris
    package typeface{ import flash.display.*; import flash.events.Event; import flash.filters.BitmapFilter; import flash.filters.BitmapFilterQuality; import flash.filters.BlurFilter; public class Main extends Sprite { private var typeSet:String="SEE > THINK > CREATE"; private var collectionSet:MovieClip; private var w:int = 1; public function Main():void { trace("start typeface application"); collectionSet = new MovieClip(); for (var n:int = 0; n < typeSet.length; n++) { var _x:int = 0 + (40 * n); var _y:int = 0; var Type:TypeCollector = new TypeCollector(_x, _y, stringToCharacter(typeSet, n), collectionSet); Type.addEventListener("action", actionHandler); } collectionSet.x = 100; collectionSet.y = (stage.stageHeight / 2) - 80; addChild(collectionSet); } private function actionHandler(event:Event):void { if (w == 16) { collectionSet.filters = [new BlurFilter(30, 30, BitmapFilterQuality.HIGH)]; removeChild(collectionSet); } w++; } public function stringToCharacter(str:String, n:int):String { if (str.length == 1) { return str; } return str.slice(n, n+1); } } } package typeface { import flash.display.*; import flash.events.Event; import flash.utils.Timer; import flash.events.TimerEvent; import flash.filters.BitmapFilter; import flash.filters.BitmapFilterQuality; import flash.filters.BlurFilter; import flash.events.EventDispatcher; public class TypeCollector extends EventDispatcher { private var TYPE_MC:typeMC; private var typeArray:Array = new Array("A", "B", "C", "D", "E", "F", "G", "H", "I", "J", "K", "L", "M", "N", "O", "P", "Q", "R", "S", "T", "U", "V", "W", "X", "Y", "Z", "1", "2", "3", "4", "5", "6", "7", "8", "9", "0", "<", ">"); private var character:String; private var num:int = 0; private var TypeTimer:Timer; private var _xNum:int; private var _yNum:int; private var movieClip:MovieClip; public function TypeCollector(_x:int, _y:int, char:String, movie:MovieClip) { var totalNum:int = typeArray.length; _xNum = _x; _yNum = _y; movieClip = movie; character = char; TypeTimer = new Timer(100, totalNum); TypeTimer.addEventListener("timer", TypeRoutTimer); TypeTimer.start(); } public function TypeRoutTimer(event:TimerEvent):void { CreateTypeFace(num, _xNum, _yNum, character); num++; } public function CreateTypeFace(num:int, _x:int, _y:int, character:String) { if (character == " ") { } else { if (TYPE_MC != null) { TYPE_MC.filters = [new BlurFilter(30, 30, BitmapFilterQuality.HIGH)]; movieClip.removeChild(TYPE_MC); } if (typeArray[num] == character) { TYPE_MC = new typeMC(); TYPE_MC.x = _x; TYPE_MC.y = _y; TYPE_MC.typeTF.text = typeArray[num]; TYPE_MC.filters = [new BlurFilter(5, 5, BitmapFilterQuality.HIGH)]; movieClip.addChild(TYPE_MC); dispatchEvent(new Event("action")); TypeTimer.stop(); } else { TYPE_MC = new typeMC(); TYPE_MC.x = _x; TYPE_MC.y = _y; TYPE_MC.typeTF.text = typeArray[num]; TYPE_MC.filters = [new BlurFilter(10, 10, BitmapFilterQuality.HIGH)]; movieClip.addChild(TYPE_MC); } } } } }

    Read the article

  • HP-UX: libstd_v2 in stack trace of JNI code compiled with g++

    - by Miguel Rentes
    Hello, uname -mr: B.11.23 ia64 g++ --version: g++ (GCC) 4.4.0 java -version: Java(TM) SE Runtime Environment (build 1.6.0.06-jinteg_20_jan_2010_05_50-b00) Java HotSpot(TM) Server VM (build 14.3-b01-jre1.6.0.06-rc1, mixed mode) I'm trying to run a Java application that uses JNI. It is crashing inside the JNI code with the following (abbreviated) stack trace: (0) 0xc0000000249353e0 VMError::report_and_die{_ZN7VMError14report_and_dieEv} + 0x440 at /CLO/Components/JAVA_HOTSPOT/Src/src/share/vm/utilities/vmError.cpp:738 [/opt/java6/jre/lib/IA64W/server/libjvm.so] (1) 0xc000000024559240 os::Hpux::JVM_handle_hpux_signal{_ZN2os4Hpux22JVM_handle_hpux_signalEiP9 __siginfoPvi} + 0x760 at /CLO/Components/JAVA_HOTSPOT/Src/src/os_cpu/hp-ux_ia64/vm/os_hp-ux_ia64.cpp:1051 [/opt/java6/jre/lib/IA64W/server/libjvm.so] (2) 0xc0000000245331c0 os::Hpux::signalHandler{_ZN2os4Hpux13signalHandlerEiP9__siginfoPv} + 0x80 at /CLO/Components/JAVA_HOTSPOT/Src/src/os/hp-ux/vm/os_hp-ux.cpp:4295 [/opt/java6/jre/lib/IA64W/server/libjvm.so] (3) 0xe00000010e002620 ---- Signal 11 (SIGSEGV) delivered ---- (4) 0xc0000000000d2d20 __pthread_mutex_lock + 0x400 at /ux/core/libs/threadslibs/src/common/pthreads/mutex.c:3895 [/usr/lib/hpux64/libpthread.so.1] (5) 0xc000000000342e90 __thread_mutex_lock + 0xb0 at ../../../../../core/libs/libc/shared_em_64/../core/threads/wrappers1.c:273 [/usr/lib/hpux64/libc.so.1] (6) 0xc00000000177dff0 _HPMutexWrapper::lock{_ZN15_HPMutexWrapper4lockEPv} + 0x90 [/usr/lib/hpux64/libstd_v2.so.1] (7) 0xc0000000017e9960 std::basic_string,std::allocator{_ZNSsC1ERKSs} + 0x80 [/usr/lib/hpux64/libstd_v2.so.1] (8) 0xc000000008fd9fe0 JniString::str{_ZNK9JniString3strEv} + 0x50 at eg_handler_jni.cxx:50 [/soft/bus-7_0/lib/libbus_registry_jni.so.7.0.0] (9) 0xc000000008fd7060 pt_efacec_se_aut_frk_cmp_registry_REGHandler::getKey{_ZN44pt_efacec_se_aut_frk_cmp_registry_REGHandler6getKeyEP8_jstringi} + 0xa0 [/soft/bus-7_0/lib/libbus_registry_jni.so.7.0.0] (10) 0xc000000008fd17f0 Java_pt_efacec_se_aut_frk_cmp_registry_REGHandler_getKey__Ljava_lang_String_2I + 0xa0 [/soft/bus-7_0/lib/libbus_registry_jni.so.7.0.0] (11) 0x9fffffffdf400ed0 Internal error (-3) while unwinding stack [/CLO/Components/JAVA_HOTSPOT/Src/src/os_cpu/hp-ux_ia64/vm/thread_hp-ux_ia64.cpp:142] This JNI code and dependencies are being compiled using g++, are multithreaded and 64 bit (-pthread -mlp64 -shared -fPIC). 
The LD_LIBRARY_PATH environment variable is set the dependencies location, and running ldd on the JNI shared libraries finds them all: ldd libbus_registry_jni.so: libefa-d.so.7 = /soft/bus-7_0/lib/libefa-d.so.7 libbus_registry-d.so.7 = /soft/bus-7_0/lib/libbus_registry-d.so.7 libboost_thread-gcc44-mt-d-1_41.so = /usr/local/lib/libboost_thread-gcc44-mt-d-1_41.so libboost_system-gcc44-mt-d-1_41.so = /usr/local/lib/libboost_system-gcc44-mt-d-1_41.so libboost_regex-gcc44-mt-d-1_41.so = /usr/local/lib/libboost_regex-gcc44-mt-d-1_41.so librt.so.1 = /usr/lib/hpux64/librt.so.1 libstdc++.so.6 = /opt/hp-gcc-4.4.0/lib/gcc/ia64-hp-hpux11.23/4.4.0/../../../hpux64/libstdc++.so.6 libm.so.1 = /usr/lib/hpux64/libm.so.1 libgcc_s.so.0 = /opt/hp-gcc-4.4.0/lib/gcc/ia64-hp-hpux11.23/4.4.0/../../../hpux64/libgcc_s.so.0 libunwind.so.1 = /usr/lib/hpux64/libunwind.so.1 librt.so.1 = /usr/lib/hpux64/librt.so.1 libm.so.1 = /usr/lib/hpux64/libm.so.1 libunwind.so.1 = /usr/lib/hpux64/libunwind.so.1 libdl.so.1 = /usr/lib/hpux64/libdl.so.1 libunwind.so.1 = /usr/lib/hpux64/libunwind.so.1 libc.so.1 = /usr/lib/hpux64/libc.so.1 libuca.so.1 = /usr/lib/hpux64/libuca.so.1 Looking at the stack trace, it seams odd that, although ldd list g++'s libstdc++ is being used, the std:string copy c'tor being reported as used is the one in libstd_v2, the implementation provided by aCC. The crash happens in the following code, when method str() returns: class JniString { std::string m_utf8; public: JniString(JNIEnv* env, jstring instance) { const char* utf8Chars = env-GetStringUTFChars(instance, 0); if (utf8Chars == 0) { env-ExceptionClear(); // RPF throw std::runtime_error("GetStringUTFChars returned 0"); } m_utf8.assign(utf8Chars); env-ReleaseStringUTFChars(instance, utf8Chars); } std::string str() const { return m_utf8; } }; Simultaneous usage of the two C++ implementations could likely be a reason for the crash, but that should not be happening. Any ideas?

    Read the article

  • Unable to modify variables in phpMyAdmin via the Variables tab (XAMPP)

    - by rookie coder
    I am quite new to phpMyAdmin configuration. I have a project that needs utf8 encoding. What I'm trying to do is change the text/char variables to utf8. I changed them, and at that moment the values changed to the values I wanted. But when I shut down XAMPP and re-enter the phpMyAdmin page, or even just refresh the page, all the values revert to their defaults (original values). My phpMyAdmin has the default root user and no password has been set yet. There is also no logout button on the phpMyAdmin landing page. I even had a difficult time setting the server connection collation (it hangs indefinitely and never seems to update). phpMyAdmin version: 4.1.6, MySQL: 5.5.36 (latest version). I doubt this is due to a malformed installation, because the same thing happens on my other computer too (exactly the same versions). What could be wrong? Thanks.
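
    One likely explanation: the Variables tab issues runtime SET GLOBAL changes, and MySQL does not persist those across a restart, so stopping XAMPP brings the defaults back. To make utf8 stick, the setting has to go into the MySQL configuration file that XAMPP reads (a sketch; the exact file, e.g. xampp\mysql\bin\my.ini, depends on the installation):

        [mysqld]
        character-set-server = utf8
        collation-server = utf8_general_ci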

    Read the article

  • How do I go about overloading C++ operators to allow for chaining?

    - by fneep
    I, like so many programmers before me, am tearing my hair out writing the right-of-passage-matrix-class-in-C++. I have never done very serious operator overloading and this is causing issues. Essentially, by stepping through This is what I call to cause the problems. cMatrix Kev = CT::cMatrix::GetUnitMatrix(4, true); Kev *= 4.0f; cMatrix Baz = Kev; Kev = Kev+Baz; //HERE! What seems to be happening according to the debugger is that Kev and Baz are added but then the value is lost and when it comes to reassigning to Kev, the memory is just its default dodgy values. How do I overload my operators to allow for this statement? My (stripped down) code is below. //header class cMatrix { private: float* _internal; UInt32 _r; UInt32 _c; bool _zeroindexed; //fast, assumes zero index, no safety checks float cMatrix::_getelement(UInt32 r, UInt32 c) { return _internal[(r*this->_c)+c]; } void cMatrix::_setelement(UInt32 r, UInt32 c, float Value) { _internal[(r*this->_c)+c] = Value; } public: cMatrix(UInt32 r, UInt32 c, bool IsZeroIndexed); cMatrix( cMatrix& m); ~cMatrix(void); //operators cMatrix& operator + (cMatrix m); cMatrix& operator += (cMatrix m); cMatrix& operator = (const cMatrix &m); }; //stripped source file cMatrix::cMatrix(cMatrix& m) { _r = m._r; _c = m._c; _zeroindexed = m._zeroindexed; _internal = new float[_r*_c]; UInt32 size = GetElementCount(); for (UInt32 i = 0; i < size; i++) { _internal[i] = m._internal[i]; } } cMatrix::~cMatrix(void) { delete[] _internal; } cMatrix& cMatrix::operator+(cMatrix m) { return cMatrix(*this) += m; } cMatrix& cMatrix::operator*(float f) { return cMatrix(*this) *= f; } cMatrix& cMatrix::operator*=(float f) { UInt32 size = GetElementCount(); for (UInt32 i = 0; i < size; i++) { _internal[i] *= f; } return *this; } cMatrix& cMatrix::operator+=(cMatrix m) { if (_c != m._c || _r != m._r) { throw new cCTException("Cannot add two matrix classes of different sizes."); } if (!(_zeroindexed && m._zeroindexed)) { throw new cCTException("Zero-Indexed mismatch."); } for (UInt32 row = 0; row < _r; row++) { for (UInt32 column = 0; column < _c; column++) { float Current = _getelement(row, column) + m._getelement(row, column); _setelement(row, column, Current); } } return *this; } cMatrix& cMatrix::operator=(const cMatrix &m) { if (this != &m) { _r = m._r; _c = m._c; _zeroindexed = m._zeroindexed; delete[] _internal; _internal = new float[_r*_c]; UInt32 size = GetElementCount(); for (UInt32 i = 0; i < size; i++) { _internal[i] = m._internal[i]; } } return *this; }
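
    The crash pattern in Kev = Kev + Baz is consistent with operator+ returning a cMatrix& that is bound to the temporary cMatrix(*this) built inside it; that temporary is destroyed when operator+ returns, so the subsequent assignment copies from dead memory. A minimal sketch of the conventional shape follows (simplified members, not a drop-in for the class above): the compound operator mutates *this and may return a reference, while the plain operator returns by value.

        #include <cstddef>
        #include <vector>

        class Matrix {
            std::vector<float> data;                  // assume both operands have the same size
        public:
            explicit Matrix(std::size_t n, float v = 0.0f) : data(n, v) {}

            Matrix& operator+=(const Matrix& rhs)
            {
                for (std::size_t i = 0; i < data.size(); ++i)
                    data[i] += rhs.data[i];
                return *this;                         // a reference to *this is safe here
            }

            // Return by value: the result is a fresh object, never a dangling reference.
            friend Matrix operator+(Matrix lhs, const Matrix& rhs)
            {
                lhs += rhs;                           // reuse the compound-assignment logic
                return lhs;
            }
        };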

    Read the article

  • Is ServerManager module available on Windows 7?

    - by ruslan
    Is the ServerManager module available on Windows 7 Ultimate? I think the PS script I have worked before (but I'm not sure, because I never looked at the output before), but after some problems with an IIS7 installation it stopped working. The following script import-module servermanager fails with the error Import-Module : The specified module 'servermanager' was not loaded because no valid module file was found in any module directory. At line:1 char:14 + import-module <<<< servermanager + CategoryInfo : ResourceUnavailable: (servermanager:String) [Import-Module], FileNotFoundException + FullyQualifiedErrorId : Modules_ModuleNotFound,Microsoft.PowerShell.Commands.ImportModuleCommand I found a recommendation to run Dism.exe /Online /Enable-Feature /FeatureName:ServerManager-PSH-Cmdlets on my machine, but it also fails with the error Feature name ServerManager-PSH-Cmdlets is unknown.

    Read the article

  • BasicAuthProvider in ServiceStack

    - by Per
    I've got an issue with the BasicAuthProvider in ServiceStack. POST-ing to the CredentialsAuthProvider (/auth/credentials) is working fine. The problem is that when GET-ing (in Chrome): http://foo:pwd@localhost:81/tag/string/list the following is the result Handler for Request not found: Request.HttpMethod: GET Request.HttpMethod: GET Request.PathInfo: /login Request.QueryString: System.Collections.Specialized.NameValueCollection Request.RawUrl: /login?redirect=http%3a%2f%2flocalhost%3a81%2ftag%2fstring%2flist which tells me that it redirected me to /login instead of serving the /tag/... request. Here's the entire code for my AppHost: public class AppHost : AppHostHttpListenerBase, IMessageSubscriber { private ITagProvider myTagProvider; private IMessageSender mySender; private const string UserName = "foo"; private const string Password = "pwd"; public AppHost( TagConfig config, IMessageSender sender ) : base( "BM App Host", typeof( AppHost ).Assembly ) { myTagProvider = new TagProvider( config ); mySender = sender; } public class CustomUserSession : AuthUserSession { public override void OnAuthenticated( IServiceBase authService, IAuthSession session, IOAuthTokens tokens, System.Collections.Generic.Dictionary<string, string> authInfo ) { authService.RequestContext.Get<IHttpRequest>().SaveSession( session ); } } public override void Configure( Funq.Container container ) { Plugins.Add( new MetadataFeature() ); container.Register<BeyondMeasure.WebAPI.Services.Tags.ITagProvider>( myTagProvider ); container.Register<IMessageSender>( mySender ); Plugins.Add( new AuthFeature( () => new CustomUserSession(), new AuthProvider[] { new CredentialsAuthProvider(), //HTML Form post of UserName/Password credentials new BasicAuthProvider(), //Sign-in with Basic Auth } ) ); container.Register<ICacheClient>( new MemoryCacheClient() ); var userRep = new InMemoryAuthRepository(); container.Register<IUserAuthRepository>( userRep ); string hash; string salt; new SaltedHash().GetHashAndSaltString( Password, out hash, out salt ); // Create test user userRep.CreateUserAuth( new UserAuth { Id = 1, DisplayName = "DisplayName", Email = "[email protected]", UserName = UserName, FirstName = "FirstName", LastName = "LastName", PasswordHash = hash, Salt = salt, }, Password ); } } Could someone please tell me what I'm doing wrong with either the SS configuration or how I am calling the service, i.e. why does it not accept the supplied user/pwd? Update1: Request/Response captured in Fiddler2when only BasicAuthProvider is used. No Auth header sent in the request, but also no Auth header in the response. GET /tag/string/AAA HTTP/1.1 Host: localhost:81 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA== HTTP/1.1 302 Found Location: /login?redirect=http%3a%2f%2flocalhost%3a81%2ftag%2fstring%2fAAA Server: Microsoft-HTTPAPI/2.0 X-Powered-By: ServiceStack/3,926 Win32NT/.NET Date: Sat, 10 Nov 2012 22:41:51 GMT Content-Length: 0 Update2 Request/Response with HtmlRedirect = null . 
SS now answers with the Auth header, which Chrome then issues a second request for and authentication succeeds GET http://localhost:81/tag/string/Abc HTTP/1.1 Host: localhost:81 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA== HTTP/1.1 401 Unauthorized Transfer-Encoding: chunked Server: Microsoft-HTTPAPI/2.0 X-Powered-By: ServiceStack/3,926 Win32NT/.NET WWW-Authenticate: basic realm="/auth/basic" Date: Sat, 10 Nov 2012 22:49:19 GMT 0 GET http://localhost:81/tag/string/Abc HTTP/1.1 Host: localhost:81 Connection: keep-alive Authorization: Basic Zm9vOnB3ZA== User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA==

    Read the article

  • Parsing a Directory of files - Check for a String

    - by i.h4d35
    This is my first post here so kindly pardon any mistakes that I have. I'm still learning to find my way around Stack Exchange. I am trying to write a Java program that tries to scan a Directory full of either .txt,.rtf or.doc files(and none other). The aim is to search all the files in the directory, and find out if a particular string exists in the file. If it does, it returns the string and the filename that it found the string in. The aim of this program is, it is a project for school wherein the program scans the personal folders of call center employees to check if they have stored any CC/DC nos and if yes, report the folder name - to reduce CC fraud. The search function was fairly straight forward and works when I individually specify the filename. However, the searching the directory and passing the files to the search function has me stumped. I've posted my code so far, if you guys could look thru it and give me some feedback/suggestions, I'd really appreciate it. Thanks in advance import java.io.*; import java.util.*; public class parse2{ void traverse(String directory) throws FileNotFoundException { File dir = new File(directory); if (dir.isDirectory()) { String[] children = dir.list(); for (int i=0; i<children.length; i++) { //System.out.println("\n" + children[i]); reader(children[i]); } } } void reader(String loc) throws FileNotFoundException { FileReader fr = new FileReader(loc); BufferedReader br = new BufferedReader(fr); Scanner sc = new Scanner(br); char[] chkArray; int chk=1; char ch; while(sc.hasNext()) { String chkStr = sc.next(); chkArray = chkStr.toCharArray(); if ((chkArray[0]=='4')&&(chkStr.length()>13)) { for(int i=0;i<chkArray.length;i++) { ch=chkArray[i]; if((ch=='0')||(ch=='1')||(ch=='2')||(ch=='3')||(ch=='4')||(ch=='5')||(ch=='6')||(ch=='7')||(ch=='8')||(ch=='9')) { chk=0; continue; } else { chk=1; break; } } if(chk==0) System.out.println("\n"+ chkStr); } else if((chkArray[0]=='5')&&(chkStr.length()>13)) { for(int i=0;i<chkArray.length;i++) { ch=chkArray[i]; if((ch=='0')||(ch=='1')||(ch=='2')||(ch=='3')||(ch=='4')||(ch=='5')||(ch=='6')||(ch=='7')||(ch=='8')||(ch=='9')) { chk=0; continue; } else { chk=1; break; } } if(chk==0) System.out.println("\n"+ chkStr); } else if((chkArray[0]=='6')&&(chkStr.length()>13)) { for(int i=0;i<chkArray.length;i++) { ch=chkArray[i]; if((ch=='0')||(ch=='1')||(ch=='2')||(ch=='3')||(ch=='4')||(ch=='5')||(ch=='6')||(ch=='7')||(ch=='8')||(ch=='9')) { chk=0; continue; } else { chk=1; break; } } if(chk==0) System.out.println("\n"+ chkStr); } } } public static void main(String args[]) throws FileNotFoundException { parse2 P = new parse2(); P.traverse("C:/Documents and Settings/h4d35/Desktop/javatest/chk"); } }

    Read the article

  • Clipplanes, vertex shaders and hardware vertex processing in Direct3D 9

    - by Igor
    Hi, I have an issue with clipplanes in my application that I can reproduce in a sample from DirectX SDK (February 2010). I added a clipplane to the HLSLwithoutEffects sample: ... D3DXPLANE g_Plane( 0.0f, 1.0f, 0.0f, 0.0f ); ... void SetupClipPlane(const D3DXMATRIXA16 & view, const D3DXMATRIXA16 & proj) { D3DXMATRIXA16 m = view * proj; D3DXMatrixInverse( &m, NULL, &m ); D3DXMatrixTranspose( &m, &m ); D3DXPLANE plane; D3DXPlaneNormalize( &plane, &g_Plane ); D3DXPLANE clipSpacePlane; D3DXPlaneTransform( &clipSpacePlane, &plane, &m ); DXUTGetD3D9Device()->SetClipPlane( 0, clipSpacePlane ); } void CALLBACK OnFrameMove( double fTime, float fElapsedTime, void* pUserContext ) { // Update the camera's position based on user input g_Camera.FrameMove( fElapsedTime ); // Set up the vertex shader constants D3DXMATRIXA16 mWorldViewProj; D3DXMATRIXA16 mWorld; D3DXMATRIXA16 mView; D3DXMATRIXA16 mProj; mWorld = *g_Camera.GetWorldMatrix(); mView = *g_Camera.GetViewMatrix(); mProj = *g_Camera.GetProjMatrix(); mWorldViewProj = mWorld * mView * mProj; g_pConstantTable->SetMatrix( DXUTGetD3D9Device(), "mWorldViewProj", &mWorldViewProj ); g_pConstantTable->SetFloat( DXUTGetD3D9Device(), "fTime", ( float )fTime ); SetupClipPlane( mView, mProj ); } void CALLBACK OnFrameRender( IDirect3DDevice9* pd3dDevice, double fTime, float fElapsedTime, void* pUserContext ) { // If the settings dialog is being shown, then // render it instead of rendering the app's scene if( g_SettingsDlg.IsActive() ) { g_SettingsDlg.OnRender( fElapsedTime ); return; } HRESULT hr; // Clear the render target and the zbuffer V( pd3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_ARGB( 0, 45, 50, 170 ), 1.0f, 0 ) ); // Render the scene if( SUCCEEDED( pd3dDevice->BeginScene() ) ) { pd3dDevice->SetVertexDeclaration( g_pVertexDeclaration ); pd3dDevice->SetVertexShader( g_pVertexShader ); pd3dDevice->SetStreamSource( 0, g_pVB, 0, sizeof( D3DXVECTOR2 ) ); pd3dDevice->SetIndices( g_pIB ); pd3dDevice->SetRenderState( D3DRS_CLIPPLANEENABLE, D3DCLIPPLANE0 ); V( pd3dDevice->DrawIndexedPrimitive( D3DPT_TRIANGLELIST, 0, 0, g_dwNumVertices, 0, g_dwNumIndices / 3 ) ); pd3dDevice->SetRenderState( D3DRS_CLIPPLANEENABLE, 0 ); RenderText(); V( g_HUD.OnRender( fElapsedTime ) ); V( pd3dDevice->EndScene() ); } } When I rotate the camera I have different visual results when using hardware and software vertex processing. In software vertex processing mode or when using the reference device the clipping plane works fine as expected. In hardware mode it seems to rotate with the camera. If I remove the call to RenderText(); from OnFrameRender then hardware rendering also works fine. Further debugging reveals that the problem is in ID3DXFont::DrawText. I have this issue in Windows Vista and Windows 7 but not in Windows XP. I tested the code with the latest NVidia and ATI drivers in all three OSes on different PCs. Is it a DirectX issue? Or incorrect usage of clipplanes? Thanks Igor

    Read the article

  • boost::serialization with mutable members

    - by redmoskito
    Using boost::serialization, what's the "best" way to serialize an object that contains cached, derived values in mutable members, such that cached members aren't serialized, but on deserialization, they are initialized the their appropriate default. A definition of "best" follows later, but first an example: class Example { public: Example(float n) : num(n), sqrt_num(-1.0) {} float get_num() const { return num; } // compute and cache sqrt on first read float get_sqrt() const { if(sqrt_num < 0) sqrt_num = sqrt(num); return sqrt_num; } template <class Archive> void serialize(Archive& ar, unsigned int version) { ... } private: float num; mutable float sqrt_num; }; On serialization, only the "num" member should be saved. On deserialization, the sqrt_num member must be initialized to its sentinel value indicating it needs to be computed. What is the most elegant way to implement this? In my mind, an elegant solution would avoid splitting serialize() into separate save() and load() methods (which introduces maintenance problems). One possible implementation of serialize: template <class Archive> void serialize(Archive& ar, unsigned int version) { ar & num; sqrt_num = -1.0; } This handles the deserialization case, but in the serialization case, the cached value is killed and must be recomputed. Also, I've never seen an example of boost::serialize that explicitly sets members inside of serialize(), so I wonder if this is generally not recommended. Some might suggest that the default constructor handles this, for example: int main() { Example e; { std::ifstream ifs("filename"); boost::archive::text_iarchive ia(ifs); ia >> e; } cout << e.get_sqrt() << endl; return 0; } which works in this case, but I think fails if the object receiving the deserialized data has already been initialized, as in the example below: int main() { Example ex1(4); Example ex2(9); cout << ex1.get_sqrt() << endl; // outputs 2; cout << ex2.get_sqrt() << endl; // outputs 3; // the following two blocks should implement ex2 = ex1; // save ex1 to archive { std::ofstream ofs("filename"); boost::archive::text_oarchive oa(ofs); oa << ex1; } // read it back into ex2 { std::ifstream ifs("filename"); boost::archive::text_iarchive ia(ifs); ia >> ex2; } // these should be equal now, but aren't, // since Example::serialize() doesn't modify num_sqrt cout << ex1.get_sqrt() << endl; // outputs 2; cout << ex2.get_sqrt() << endl; // outputs 3; return 0; } I'm sure this issue has come up with others, but I have struggled to find any documentation on this particular scenario. Thanks!
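
    One way to keep a single serialize() is to branch on the archive direction: every Boost archive exposes a compile-time is_loading flag, so the cache reset can be limited to deserialization and the cached value survives saving. A sketch based on the Example class above (the -1.0 sentinel and lazy recompute are kept from the post; treat this as one possible shape, not the only one):

        #include <cmath>
        #include <boost/serialization/access.hpp>

        class Example {
        public:
            Example(float n = 0.0f) : num(n), sqrt_num(-1.0f) {}

            float get_num() const { return num; }

            float get_sqrt() const {
                if (sqrt_num < 0) sqrt_num = std::sqrt(num);   // lazy cache, as in the post
                return sqrt_num;
            }

            template <class Archive>
            void serialize(Archive& ar, unsigned int /*version*/) {
                ar & num;                          // only the real state goes into the archive
                if (Archive::is_loading::value)    // compile-time flag provided by Boost archives
                    sqrt_num = -1.0f;              // invalidate the cache only when reading
            }

        private:
            friend class boost::serialization::access;
            float num;
            mutable float sqrt_num;
        };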

    Read the article

  • How to redirect terminal contents to a JTextPane?

    - by sonu thomas
    hi.. I was trying to run a java class file using java code.The aim was to direct the executing sequence of the terminal of fedora 10 into a frame with a textpane. My code is: import java.io.DataInputStream; import java.io.IOException; import javax.swing.JDialog; import javax.swing.JFrame; import javax.swing.JOptionPane; import javax.swing.JPanel; import javax.swing.JTextArea; import javax.swing.JTextPane; //import sun.reflect.ReflectionFactory.GetReflectionFactoryAction; public class file { /** * @param args */ /** * @param args * @throws IOException */ public static void main(String[] args) throws IOException{ // TODO Auto-generated method stub JDialog jj = new JDialog(); jj.setTitle("helllo"); String sssub1,sssub2,sssub3; String ss="C:/Users/sonu/Desktop/ourIDE/src/src/nn.java"; JPanel n = new JPanel(); jj.setContentPane(n); JTextPane tpn2=new JTextPane(); jj.getContentPane().add(tpn2); jj.setVisible(true); Runtime runtime; Process process; if(ss.indexOf(" ")==-1) { try { runtime= Runtime.getRuntime(); sssub1="/home/ss/Desktop/src/"; sssub2="nn"; process=runtime.exec("sh jrun.sh "+sssub1+" "+sssub2); DataInputStream data=new DataInputStream(process.getInputStream()); DataInputStream data_data=new DataInputStream(process.getErrorStream()); String s="",t=""; int ch; while((ch=data.read())!=-1){ s=s+(char)ch; } data.close(); while((ch=data_data.read())!=-1){ t=t+(char)ch; } data_data.close(); if(t.equals("")) { s+="\nNormal Termination."; tpn2.setText(s); } else tpn2.setText(t); }catch(Exception e){ System.out.println("Error executing file==>"+e);} } } } The content of **jrun.sh** is cd $1 java $2 When the content of **nn.java** was this: import java.io.*; class nn { public static void main(String[] ar)throws IOException { int i=90; BufferedReader dt = new BufferedReader(new InputStreamReader(System.in)); System.out.println("Enter"); //i=Integer.parseInt(dt.readLine()); line--notable System.out.println("i="+i); } } It worked smoothly But when I remove the comment on line--notable,it gave me Textpane with no content. The problem is : I cant read an input from nn.java Kindly give me a solution... If i am able to: get the terminal pop up with executing the nn.class,and i am able to enter the input,then it will do... Thanks in advance...

    Read the article

  • Unable to launch onscreen keyboard (osk.exe) from a 32-bit process on Win7 x64

    - by Steven Robbins
    90% of the time I am unable to launch osk.exe from a 32bit process on Win7 x64. Originally the code was just using: Process.Launch("osk.exe"); Which won't work on x64 because of the directory virtualization. Not a problem I thought, I'll just disable virtualization, launch the app, and enable it again, which I thought was the correct way to do things. I also added some code to bring the keyboard back up if it has been minimized (which works fine) - the code (in a sample WPF app) now looks as follows: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Windows; using System.Windows.Controls; using System.Windows.Data; using System.Windows.Documents; using System.Windows.Input; using System.Windows.Media; using System.Windows.Media.Imaging; using System.Windows.Navigation;using System.Diagnostics; using System.Runtime.InteropServices; namespace KeyboardTest { /// <summary> /// Interaction logic for MainWindow.xaml /// </summary> public partial class MainWindow : Window { [DllImport("kernel32.dll", SetLastError = true)] private static extern bool Wow64DisableWow64FsRedirection(ref IntPtr ptr); [DllImport("kernel32.dll", SetLastError = true)] public static extern bool Wow64RevertWow64FsRedirection(IntPtr ptr); private const UInt32 WM_SYSCOMMAND = 0x112; private const UInt32 SC_RESTORE = 0xf120; [DllImport("user32.dll", CharSet = CharSet.Auto)] static extern IntPtr SendMessage(IntPtr hWnd, UInt32 Msg, IntPtr wParam, IntPtr lParam); private string OnScreenKeyboadApplication = "osk.exe"; public MainWindow() { InitializeComponent(); } private void KeyboardButton_Click(object sender, RoutedEventArgs e) { // Get the name of the On screen keyboard string processName = System.IO.Path.GetFileNameWithoutExtension(OnScreenKeyboadApplication); // Check whether the application is not running var query = from process in Process.GetProcesses() where process.ProcessName == processName select process; var keyboardProcess = query.FirstOrDefault(); // launch it if it doesn't exist if (keyboardProcess == null) { IntPtr ptr = new IntPtr(); ; bool sucessfullyDisabledWow64Redirect = false; // Disable x64 directory virtualization if we're on x64, // otherwise keyboard launch will fail. if (System.Environment.Is64BitOperatingSystem) { sucessfullyDisabledWow64Redirect = Wow64DisableWow64FsRedirection(ref ptr); } // osk.exe is in windows/system folder. So we can directky call it without path using (Process osk = new Process()) { osk.StartInfo.FileName = OnScreenKeyboadApplication; osk.Start(); osk.WaitForInputIdle(2000); } // Re-enable directory virtualisation if it was disabled. if (System.Environment.Is64BitOperatingSystem) if (sucessfullyDisabledWow64Redirect) Wow64RevertWow64FsRedirection(ptr); } else { // Bring keyboard to the front if it's already running var windowHandle = keyboardProcess.MainWindowHandle; SendMessage(windowHandle, WM_SYSCOMMAND, new IntPtr(SC_RESTORE), new IntPtr(0)); } } } } But this code, most of the time, throws the following exception on osk.Start(): The specified procedure could not be found at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) I've tried putting long Thread.Sleep commands in around the osk.Start line, just to make sure it wasn't a race condition, but the same problem persists. Can anyone spot where I'm doing something wrong, or provide an alternative solution for this? It seems to work fine launching Notepad, it just won't play ball with the onscreen keyboard.

    Read the article

  • How to read Urdu from a URL and show it on a mobile using Java ME

    - by Basit
    HI, Hope you all will be fine. Actually I m facing a problem. Actually i am using Google translation API. What my application does it connect to CGI-script, i pass value to it using GET then the CGI script connect to Google API, Translate the mesage from english to Urdu and then i retreive it.Here is the code [Java] import java.io.*; import javax.microedition.io.*; import javax.microedition.lcdui.*; import javax.microedition.midlet.*; /** * An example MIDlet to invoke a CGI script (GET method). */ public class InvokeCgiMidlet1 extends MIDlet { private Display display; String url = "http://xxx.xxx.xx.xxx/cgi-bin/api/GT/GT_Send_Msg.cgi?message=my%20name%20is%20basit"; public InvokeCgiMidlet1() { display = Display.getDisplay(this); } /** * Initialization. Invoked when we activate the MIDlet. */ public void startApp() { try { getGrade(url); } catch (IOException e) { System.out.println("IOException " + e); e.printStackTrace(); } } /** * Pause, discontinue .... */ public void pauseApp() { } /** * Destroy must cleanup everything. */ public void destroyApp(boolean unconditional) { } /** * Retrieve a grade.... */ void getGrade(String url) throws IOException { HttpConnection c = null; InputStream is = null; OutputStream os = null; StringBuffer b = new StringBuffer(); TextBox t = null; String response; try { c = (HttpConnection)Connector.open(url, Connector.READ_WRITE); c.setRequestMethod(HttpConnection.GET); c.setRequestProperty("User-Agent","Profile/MIDP-1.0 Confirguration/CLDC-1.0"); c.setRequestProperty("Content-Type","application/x-www-form-urlencoded"); c.setRequestProperty("User-Agent","Profile/MIDP-2.0 Configuration/CLDC-1.1"); c.setRequestProperty("Accept-Charset","UTF-8;q=0.7,*;q=0.7"); c.setRequestProperty("Accept-Encoding","gzip, deflate"); c.setRequestProperty("Accept","text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5"); c.setRequestProperty("Content-Language", "en-EN"); os = c.openOutputStream(); Reader r = new InputStreamReader(c.openDataInputStream(), "UTF-8"); int ch; while ((ch = r.read()) != -1) { b.append((char) ch ); //System.out.println((char)ch + "->" + ch + "->" + ch); } t = new TextBox("Final Grades", b.reverse().toString(), 1024, 0); } finally { if(is!= null) { is.close(); } if(os != null) { os.close(); } if(c != null) { c.close(); } } display.setCurrent(t); } } [/Java] The problem is as i told you that the translated text is in Urdu. So when it appear on screen each character is separate like this. ? ?? ? ? ?? . Because i read character by character I want it to appear in proper form like this ??????? So how can i do this. Is there font rendering required. If yes then how can i do it or any other method please hep me. Thanks

    Read the article

  • Delphi 2010 differs in Canvas transparency compared to Delphi 7?

    - by Tom1952
    I'm porting some very old code from Delph7 to Delphi2010 with a few changes as possible to the existing code base for the usual reasons. First: the good news for anyone who hasn't jumped yet: it's not as daunting as it may look! I'm actually pleased (& surprised) at how easy 1,000,000+ lines of code have moved across. And what a relief to be back on the leading edge! Delphi 2010 has so many great enhancements. However, I'm having a cosmetic problem with some TStringGrids and TDbGrids descendants. In the last century (literally!) someone wrote the two methods below. The first method is used to justify text. When run in Delphi 2010, the new text and the unjustified text to both appear in the cells written to. Of course it's a mess visually, almost illegible. Sometimes, as a result of the second method is use, the grid cells are actually semi-transparent, with text from the window below showing through. (Again, not pretty!) It appears to me that Delphi 2010's TDbGrid and TStringGrid have some differences in the way they handle transparency? I haven't much experience in this area of Delphi (in fact, I have no idea what the 2nd method is actually doing!) and was hoping someone could give me some pointers on what's going on and how to fix it. TIA! Method 1 procedure TForm1.gridDrawCell(Sender: TObject; Col, Row: Integer; Rect: TRect; State: TGridDrawState); {Used to align text in cells.} var x: integer; begin if (Row > 0) AND (Col > 0) then begin SetTextAlign(grdTotals.Canvas.Handle, TA_RIGHT); x := Rect.Right - 2; end else begin SetTextAlign(grdTotals.Canvas.Handle, TA_CENTER); x := (Rect.Left + Rect.Right) div 2; end; grdTotals.Canvas.TextRect(Rect, x, Rect.Top+2, grdTotals.Cells[Col,Row]); end; Method 2 procedure WriteText(ACanvas: TCanvas; ARect: TRect; DX, DY: Integer; const Text: string; TitleBreak: TTitleBreak; Alignment: TAlignment); const AlignFlags: array [TAlignment] of Integer = (DT_LEFT or { DT_WORDBREAK or } DT_EXPANDTABS or DT_NOPREFIX, DT_RIGHT or { DT_WORDBREAK or } DT_EXPANDTABS or DT_NOPREFIX, DT_CENTER or { DT_WORDBREAK or } DT_EXPANDTABS or DT_NOPREFIX); var ABitmap: TBitmap; AdjustBy: Integer; B, R: TRect; WordBreak: Integer; begin WordBreak := 0; if (TitleBreak = tbAlways) or ((TitleBreak = tbDetect) and (Pos(Chr(13) + Chr(10), Text) = 0)) then WordBreak := DT_WORDBREAK; ABitmap := TBitmap.Create; try ABitmap.Canvas.Lock; try AdjustBy := 1; if (Alignment = taRightJustify) then Inc(AdjustBy); with ABitmap, ARect do begin Width := Max(Width, Right - Left); Height := Max(Height, Bottom - Top); R := Rect(DX, DY, Right - Left - AdjustBy, Bottom - Top - 1); { @@@ } B := Rect(0, 0, Right - Left, Bottom - Top); end; with ABitmap.Canvas do begin Font := ACanvas.Font; Brush := ACanvas.Brush; Brush.Style := bsSolid; FillRect(B); SetBkMode(Handle, TRANSPARENT); DrawText(Handle, PChar(Text), Length(Text), R, AlignFlags[Alignment] or WordBreak); end; ACanvas.CopyRect(ARect, ABitmap.Canvas, B); finally ABitmap.Canvas.Unlock; end; finally ABitmap.Free; end; end;

    Read the article

  • sed works properly in SSH, not in PHP

    - by David
    So, I have the following line that I run in PHP with exec($addPHPtags); $addPHPtags = "/bin/sed -i '/<BODY BGCOLOR=\"#FFFFFF\">/ a\ <?php \n ?> '" . $instance['file'] . " 2>&1"; I'd expect that command to find the key and append a PHP tag after it. However, when I run it in PHP, I get the following error if I trap the command output: [0] => /bin/sed: -e expression #1, char 39: unknown command: `?' However, if I run the same command over SSH, it works completely fine: /bin/sed -i '/<BODY BGCOLOR=\"#FFFFFF\">/ a\ <?php \n ?>' file.php I'm out of ideas; I've tried various alternatives but to no avail. Any help? Thanks.

    Read the article
