Search Results

Search found 3768 results on 151 pages for 'lite byte'.

  • Is there a concise way to create an InputSupplier for an InputStream in Google Guava?

    - by Fabian Steeg
    There are a few factory methods in Google Guava for creating InputSuppliers, e.g. from a byte[]: ByteStreams.newInputStreamSupplier(bytes); or from a File: Files.newInputStreamSupplier(file); Is there a similar way to create an InputSupplier for a given InputStream? That is, a way that's more concise than an anonymous class:

        new InputSupplier<InputStream>() {
            public InputStream getInput() throws IOException {
                return inputStream;
            }
        };

    Background: I'd like to use InputStreams with e.g. Files.copy(...) or ByteStreams.equal(...).
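
    One approach - a minimal sketch, not an official Guava factory - is to write the anonymous class once in a small helper and reuse it. Note the caveat: unlike the byte[] and File suppliers, a supplier wrapping a live InputStream is single-use, because it hands out the same stream instance every time. The class name Suppliers2 is made up for illustration.

        import com.google.common.io.InputSupplier;
        import java.io.IOException;
        import java.io.InputStream;

        final class Suppliers2 {
            // Single-use: every call to getInput() returns the same stream.
            static InputSupplier<InputStream> of(final InputStream in) {
                return new InputSupplier<InputStream>() {
                    public InputStream getInput() throws IOException {
                        return in;
                    }
                };
            }
        }

    With that, ByteStreams.equal(Suppliers2.of(myStream), Files.newInputStreamSupplier(file)) reads concisely at the call site.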

  • Python - Finding unicode/ascii problems

    - by user330739
    Hi all, I am using csv.reader to pull in info from a very long sheet. I am doing work on that data set and then using the xlwt package to produce a workable Excel file. However, I get this error: UnicodeDecodeError: 'ascii' codec can't decode byte 0x92 in position 34: ordinal not in range(128) My question to you all is: how can I find exactly where that error is in my data set? Also, is there some code I can write that will look through my data set and find out where the issues lie (some data sets run without the above error and others have problems)?
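
    A rough sketch of one way to pinpoint the bad cells, assuming a Python 2 setup where csv.reader yields byte strings: decode each cell separately and record the row/column of every failure. The function name find_bad_cells is made up for illustration.

        import csv

        def find_bad_cells(path, encoding='ascii'):
            """Yield (row, column, cell, error) for each cell that fails to decode."""
            with open(path, 'rb') as f:
                for row_num, row in enumerate(csv.reader(f), start=1):
                    for col_num, cell in enumerate(row, start=1):
                        try:
                            cell.decode(encoding)
                        except UnicodeDecodeError as e:
                            yield row_num, col_num, cell, e

        for row, col, cell, err in find_bad_cells('data.csv'):
            print 'row %d, col %d: %r (%s)' % (row, col, cell, err)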

  • How to receive a datastream from a device on your computer, in C#

    - by WebDevHobo
    I plan to build a small audio-recorder app in C#. My laptop has a built-in microphone that's always active, so I want to use that as an early-stage test. I would simply start recording, save the file as a .wav, or even use the LAME DLL to turn it into an MP3. The problem is, I don't know how to talk to that microphone. Do I use a library that can detect a device, or do I just catch a stream of bytes from the port that the device is on? I don't have any experience with receiving data from connected devices. I suppose I'll need to read all the data into a byte array and then write it out as a WAV file, but I'm not sure. Can I get some pointers on this subject?

  • CryptGenRandom to generate asp.net session id

    - by DoDo
    Hi! Does anyone have a working example of using CryptGenRandom to generate a session id (I need to use it in my IIS module)?

        HCRYPTPROV hCryptProv;
        BYTE pbData[16];
        if (CryptAcquireContext(&hCryptProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT)) {
            if (CryptGenRandom(hCryptProv, 8, pbData)) {
                std::string s((const char *) pbData);
                printf(s.c_str());
            } else {
                MyHandleError("Error during CryptGenRandom.");
            }
        } else {
            MyHandleError("Error during CryptAcquireContext!\n");
        }

    I tried this code (I got it from MSDN), but it's not working quite well, and this example doesn't work for me either: http://www.codeproject.com/KB/security/plaintextsessionkey.aspx So if anyone knows how to generate a session id this way, please let me know. Thanks!
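
    For what it's worth, the snippet treats pbData as a C string, but the random bytes are neither printable nor NUL-terminated, so constructing std::string from them reads past the buffer or stops at an embedded zero. A minimal sketch of the usual approach - hex-encode the bytes before using them as a session id (error handling elided):

        #include <windows.h>
        #include <wincrypt.h>
        #include <string>
        #include <cstdio>

        std::string GenerateSessionId()
        {
            HCRYPTPROV hCryptProv = 0;
            BYTE pbData[16];
            std::string id;
            if (CryptAcquireContext(&hCryptProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT)) {
                if (CryptGenRandom(hCryptProv, sizeof(pbData), pbData)) {
                    char hex[3];
                    for (size_t i = 0; i < sizeof(pbData); ++i) {
                        sprintf(hex, "%02x", pbData[i]);  // two hex digits per byte
                        id += hex;
                    }
                }
                CryptReleaseContext(hCryptProv, 0);
            }
            return id;  // 32 hex characters for 16 random bytes, e.g. "9f3a..."
        }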

  • Object to Network serialization - with an existing protocol

    - by cpf
    I'm writing a client for a server program written in C++. As is not unusual, the entire networking protocol is in a format where packets can easily be memcpy'd into/out of a C++ structure (a 1-byte packet code, then a different layout per packet type). I could do the same thing in C#, but is there an easier way, especially considering that much of the data is fixed-length char arrays that I want to treat as strings? Or should I just suck it up and convert types as needed? I've looked at the ISerializable interface, but it doesn't look as low-level as is required.
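
    One common technique for memcpy-style protocols in C# is a [StructLayout(LayoutKind.Sequential)] struct plus Marshal.PtrToStructure, with [MarshalAs(UnmanagedType.ByValTStr)] turning the fixed-length char arrays into strings automatically. A sketch; the LoginPacket layout is invented for illustration:

        using System;
        using System.Runtime.InteropServices;

        [StructLayout(LayoutKind.Sequential, Pack = 1, CharSet = CharSet.Ansi)]
        struct LoginPacket
        {
            public byte Code;                                   // 1-byte packet code
            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
            public string UserName;                             // char[32] on the wire
        }

        static class Wire
        {
            // Reinterpret a received buffer as a struct, like memcpy in C++.
            public static T Read<T>(byte[] buffer) where T : struct
            {
                GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
                try
                {
                    return (T) Marshal.PtrToStructure(handle.AddrOfPinnedObject(), typeof(T));
                }
                finally
                {
                    handle.Free();
                }
            }
        }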

  • How can I get Perl to detect the bad UTF-8 sequences?

    - by gorilla
    I'm running Perl 5.10.0 and Postgres 8.4.3, inserting strings into a database behind DBIx::Class. These strings should be in UTF-8, and therefore my database is running in UTF-8. Unfortunately, some of these strings are bad, containing malformed UTF-8, so when I run it I get an exception: DBI Exception: DBD::Pg::st execute failed: ERROR: invalid byte sequence for encoding "UTF8": 0xb5 I thought I could simply ignore the invalid ones and worry about the malformed UTF-8 later, so this code should flag and ignore the bad titles:

        if (not utf8::valid($title)) {
            $title = "Invalid UTF-8";
        }
        $data->title($title);
        $data->update();

    However, Perl seems to think the strings are valid, yet it still throws the exceptions. How can I get Perl to detect the bad UTF-8?
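
    Note that utf8::valid only checks Perl's internal bookkeeping, not whether arbitrary bytes form well-formed UTF-8, which is presumably why it passes everything here. A sketch of one common alternative: round-trip through Encode with FB_CROAK, which dies on malformed input:

        use Encode qw(decode);

        my $ok = eval {
            my $copy = $title;                          # decode() can modify its input
            decode('UTF-8', $copy, Encode::FB_CROAK);   # dies on malformed UTF-8
            1;
        };
        $title = "Invalid UTF-8" unless $ok;

        $data->title($title);
        $data->update();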

  • Maximum number of bytes that can be sent on a TCP connection

    - by iamrohitbanga
    I initially assumed that since TCP has a 32-bit sequence number field and each byte sent on a TCP connection is labeled with a unique sequence number, the maximum number of bytes that can be sent over a TCP connection is about 2^32-1 or 2^32-2 (which?). But now I think that since TCP is a sliding-window protocol, sequence-number wraparound during the connection should have no effect on the maximum number of bytes that can be sent, as long as, by the time wraparound occurs, the old packet with the same sequence number is no longer in the network (it was sent more than 2*MSL earlier). What is the correct answer?

  • UTF-8 conversion doesn't always work

    - by Marco Piccinni
    I searched other questions before typing here and didn't find anything similar. I have to scrape various UTF-8 web pages which contain text like "Oggi è una bellissima giornata"; the problem is with the character "è". I extract this text with JTidy and an XPath query expression and convert it with:

        byte[] content = filteredEncodedString.getBytes("utf-8");
        String result = new String(content, "utf-8");

    where filteredEncodedString contains the text "Oggi è una bellissima giornata". This procedure works on most of the web pages analyzed so far, but in some cases it doesn't extract a UTF-8 string. The page encoding is always the same and the text is similar. Any ideas about the problem? Thanks, Marco
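
    For what it's worth, the getBytes("utf-8") / new String(..., "utf-8") round trip above is a no-op; it cannot repair a string that was decoded with the wrong charset earlier, when the page's bytes were first turned into characters. A sketch of forcing the charset at that point instead (assuming the bytes come from a URL connection; adapt to however the page is fetched):

        import java.io.BufferedReader;
        import java.io.InputStream;
        import java.io.InputStreamReader;
        import java.net.URL;

        InputStream in = new URL("http://example.com/page.html").openStream();
        // Decode the raw bytes as UTF-8 explicitly rather than with the platform default.
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
        // Hand this Reader (or the configured input encoding) to JTidy so it never guesses.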

  • Me As Child Type In General Function

    - by Steven
    I have a MustInherit Parent class with two Child classes which Inherit from the Parent. How can I use (or cast) Me in a Parent function as the child type of that instance? EDIT: My actual goal is to be able to serialize (BinaryFormatter.Serialize(Stream, Object)) either of my child classes. However, "repeating the code" in each child "seems" wrong. EDIT2: This is my Serialize function. Where should I implement this function? Copying and pasting it into each child doesn't seem right, but casting the parent to a child doesn't seem right either.

        Public Function Serialize() As Byte()
            Dim bFmt As New BinaryFormatter()
            Dim mStr As New MemoryStream()
            bFmt.Serialize(mStr, Me)
            Return mStr.ToArray()
        End Function
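
    For what it's worth, BinaryFormatter looks at the runtime type of the object it is given, and Me inside a Parent method is the actual child instance, so a single implementation in the parent already serializes either child without casting. A sketch, assuming both child classes are also marked <Serializable()> and the usual Imports for BinaryFormatter and MemoryStream are present:

        <Serializable()> _
        Public MustInherit Class Parent
            ' Me is the runtime instance, so this serializes the concrete
            ' child type even though the method lives in the parent.
            Public Function Serialize() As Byte()
                Dim bFmt As New BinaryFormatter()
                Dim mStr As New MemoryStream()
                bFmt.Serialize(mStr, Me)
                Return mStr.ToArray()
            End Function
        End Class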

  • C++ : size of int, long, etc...

    - by Jérôme
    I'm looking for detailed information regarding the sizes of basic C++ types. I know that it depends on the architecture (16-bit, 32-bit, 64-bit) and the compiler, but are there any standards? I'm using Visual Studio 2008 on a 32-bit architecture. Here is what I get:

        char   : 1 byte
        short  : 2 bytes
        int    : 4 bytes
        long   : 4 bytes
        float  : 4 bytes
        double : 8 bytes

    I tried to find, without much success, reliable information on the sizes of char, short, int, long, double, float (and other types I haven't thought of) under different architectures and compilers.
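
    The standard pins down only minimums, not exact sizes: sizeof(char) is 1 by definition, CHAR_BIT is at least 8, and the required ranges imply roughly short/int >= 16 bits and long >= 32 bits (plus long long >= 64 bits in C99/C++0x), with sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long). A quick sketch to print what a particular compiler actually chooses:

        #include <iostream>
        #include <climits>

        int main()
        {
            std::cout << "CHAR_BIT: " << CHAR_BIT << "\n"
                      << "char:   " << sizeof(char)   << "\n"
                      << "short:  " << sizeof(short)  << "\n"
                      << "int:    " << sizeof(int)    << "\n"
                      << "long:   " << sizeof(long)   << "\n"
                      << "float:  " << sizeof(float)  << "\n"
                      << "double: " << sizeof(double) << "\n";
            return 0;  // sizeof reports sizes in multiples of char
        }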

  • Using C#, how to convert ISO 8859-1 encoded text files that contain Latin-1 accented characters to UTF-8

    - by Tim
    I am being sent text files saved in ISO 8859-1 format that contain accented characters from the Latin-1 range (as well as normal ASCII a-z, etc.). How do I convert these files to UTF-8 using C# so that the single-byte accented characters in ISO 8859-1 become valid UTF-8 characters? I have tried using a StreamReader with ASCIIEncoding, and then converting the ASCII string to UTF-8 by instantiating an ASCII encoding and a UTF-8 encoding and then using Encoding.Convert(ascii, utf8, ascii.GetBytes(asciiString)), but the accented characters are being rendered as question marks. What step am I missing? Thanks
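
    ASCIIEncoding is the likely culprit: it is strictly 7-bit, so every accented byte becomes '?' before the conversion even starts. A minimal sketch that decodes with the Latin-1 codepage and re-encodes as UTF-8 (file names are placeholders):

        using System.IO;
        using System.Text;

        Encoding latin1 = Encoding.GetEncoding("iso-8859-1");

        // Read the bytes as ISO 8859-1, write them back out as UTF-8.
        using (StreamReader reader = new StreamReader("input.txt", latin1))
        using (StreamWriter writer = new StreamWriter("output.txt", false, Encoding.UTF8))
        {
            writer.Write(reader.ReadToEnd());
        }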

  • How to get Host OS (operating system) parameter of ZIP archive in C#

    - by bao
    Look: I have ZIP archives prepared on different OSes: Mac, Linux, Windows. On Windows, file names are encoded in DOS CP866; on Mac and Linux, in UTF-8. I need to know (in code) on which OS the zip file was prepared, so I can decode the file names correctly. There is a Host OS parameter in the "central directory structure" of a zip file (see http://www.fileformat.info/format/zip/corion.htm ). How do I read the 1-byte Host OS parameter at offset 0005h in C#?
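
    A rough sketch of reading the field by hand, under the PKZIP layout: every central directory entry begins with the signature 0x02014b50, followed by a two-byte "version made by" field whose high byte is the host OS (0 = MS-DOS/Windows, 3 = Unix). Scanning the whole file for the signature is a simplification; a robust reader would locate the central directory via the end-of-central-directory record, or use a zip library that exposes the field.

        using System.IO;

        static int GetZipHostOs(string path)
        {
            byte[] data = File.ReadAllBytes(path);
            // Central directory file header signature: 50 4B 01 02.
            for (int i = 0; i + 5 < data.Length; i++)
            {
                if (data[i] == 0x50 && data[i + 1] == 0x4B &&
                    data[i + 2] == 0x01 && data[i + 3] == 0x02)
                {
                    return data[i + 5];  // high byte of "version made by" = host OS
                }
            }
            return -1;  // no central directory entry found
        }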

  • Multi-threaded downloader in C# question

    - by blez
    Currently I have a multi-threaded downloader class that uses HttpWebRequest/Response. All works fine, it's super fast, BUT... the problem is that the data needs to be streamed to another app while it's downloading. That means it must be streamed in the right order: the first chunk first, and then the next in the queue. Currently my downloader class is synchronous and Download() returns byte[]. In my async multi-threaded class I make, for example, a list with 4 empty elements (slots) and pass each slot's index to a thread that calls the Download() function. That simulates synchronization, but it's not what I need. How should I do the queue thing, to make sure the data is streamed as soon as the first chunk starts downloading?
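
    One way to decouple download order from delivery order is a small reorder buffer: every worker deposits its finished chunk under the chunk's index, and a single flush loop writes chunks out only while the next expected index is present. A hedged sketch (locking kept deliberately simple; the downloading itself is omitted):

        using System.Collections.Generic;
        using System.IO;

        class ReorderBuffer
        {
            private readonly SortedDictionary<int, byte[]> _done = new SortedDictionary<int, byte[]>();
            private readonly object _lock = new object();
            private readonly Stream _out;
            private int _next;  // index of the next chunk to stream out

            public ReorderBuffer(Stream output) { _out = output; }

            // Called from any worker thread when its chunk finishes.
            public void Complete(int index, byte[] chunk)
            {
                lock (_lock)
                {
                    _done.Add(index, chunk);
                    byte[] data;
                    // Flush everything now contiguous with what was already sent.
                    while (_done.TryGetValue(_next, out data))
                    {
                        _out.Write(data, 0, data.Length);
                        _done.Remove(_next);
                        _next++;
                    }
                }
            }
        }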

  • Java String to SHA1

    - by AeroDroid
    I'm trying to make a simple String-to-SHA1 converter in Java and this is what I've got...

        public static String toSHA1(byte[] convertme) {
            MessageDigest md = null;
            try {
                md = MessageDigest.getInstance("SHA-1");
            } catch (NoSuchAlgorithmException e) {
                e.printStackTrace();
            }
            return new String(md.digest(convertme));
        }

    When I pass it toSHA1("password".getBytes()), I get "[?a?????%l?3~??." I know it's probably a simple encoding fix like UTF-8, but could someone tell me what I should do to get what I want, which is "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8"? Or am I doing this completely wrong? Thanks a lot!
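
    The digest is raw bytes, not text, so wrapping it in new String(...) mangles it; hex-encoding each byte yields the familiar 40-character form. A small sketch:

        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;

        public static String toSHA1Hex(byte[] convertme) throws NoSuchAlgorithmException {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest(convertme)) {
                hex.append(String.format("%02x", b));  // two lowercase hex digits per byte
            }
            return hex.toString();  // "5baa61e4..." for "password".getBytes()
        }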

  • How to convert from integer to unsigned char in C, given integers larger than 256?

    - by Alf_InPogform
    As part of my CS course I've been given some functions to use. One of these functions takes a pointer to unsigned chars to write some data to a file (I have to use this function, so I can't just make my own purpose-built function that works differently, BTW). I need to write an array of integers whose values can be up to 4095 using this function (which only takes unsigned chars). However, am I right in thinking that an unsigned char can only have a max value of 256 because it is 1 byte long? I would therefore need to use 4 unsigned chars for every integer? But casting doesn't seem to work with larger values of the integer. Does anyone have any idea how best to convert an array of integers to unsigned chars?
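
    For the record, an unsigned char holds 0-255 on an 8-bit-char platform, and a 0-4095 value therefore fits in two bytes, not four; plain casting fails because it keeps only the low byte. A sketch of packing and unpacking with shifts and masks:

        #include <stdio.h>

        /* Pack one 12-bit value (0..4095) into two unsigned chars, high byte first. */
        void pack(int value, unsigned char out[2])
        {
            out[0] = (unsigned char)((value >> 8) & 0xFF);  /* high byte */
            out[1] = (unsigned char)(value & 0xFF);         /* low byte  */
        }

        int unpack(const unsigned char in[2])
        {
            return (in[0] << 8) | in[1];
        }

        int main(void)
        {
            unsigned char bytes[2];
            pack(4095, bytes);
            printf("%d\n", unpack(bytes));  /* prints 4095 */
            return 0;
        }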

  • KeyListener problem

    - by rgksugan
    In my application I am using a JPanel to which I want to add a key listener. I did, but it does not work. Is it because I am using a SwingWorker to update the contents of the panel every second? Here is my code to update the panel:

        RenderedImage image = ImageIO.read(new ByteArrayInputStream((byte[]) get()));
        Graphics graphics = remote.rdpanel.getGraphics();
        if (graphics != null) {
            Image readyImage = new ImageIcon(UtilityFunctions.convertRenderedImage(image)).getImage();
            graphics.drawImage(readyImage, 0, 0, remote.rdpanel.getWidth(), remote.rdpanel.getHeight(), null);
        }
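
    A KeyListener on a JPanel only fires while the panel has keyboard focus, which a plain panel never gets by default - that is a more likely cause than the SwingWorker. A sketch of the two usual remedies (assuming the usual javax.swing imports): requesting focus, or the generally preferred key bindings, which don't depend on which child has focus.

        // Option 1: make the panel focusable so its KeyListener can fire.
        panel.setFocusable(true);
        panel.requestFocusInWindow();

        // Option 2: key bindings fire whenever the window is focused.
        panel.getInputMap(JComponent.WHEN_IN_FOCUSED_WINDOW)
             .put(KeyStroke.getKeyStroke("ESCAPE"), "stop");
        panel.getActionMap().put("stop", new AbstractAction() {
            @Override
            public void actionPerformed(java.awt.event.ActionEvent e) {
                // react to the key press here
            }
        });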

  • Quick way to do data lookup in PHP

    - by Ghostrider
    I have a data table with 600,000 records that is around 25 megabytes in size. It is indexed by a 4-byte key. Is there a way to find a row in such a dataset quickly with PHP, without resorting to MySQL? The website in question is mostly static, with minor PHP code and no database dependencies, and is therefore fast. I would like to add this data without having to use MySQL if possible. In C++ I would memory-map the file and do a binary search in it. Is there a way to do something similar in PHP?
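
    The memory-map-and-bisect idea translates to PHP with fseek() over fixed-size records; a sketch, assuming the file is sorted by key and each record is a fixed $recSize bytes beginning with the 4-byte big-endian key:

        <?php
        function lookup($file, $key, $recSize) {
            $fp = fopen($file, 'rb');
            $lo = 0;
            $hi = filesize($file) / $recSize - 1;
            while ($lo <= $hi) {
                $mid = (int) (($lo + $hi) / 2);
                fseek($fp, $mid * $recSize);
                $rec = fread($fp, $recSize);
                $u = unpack('N', $rec);   // first 4 bytes as a big-endian integer
                if ($u[1] == $key)    { fclose($fp); return $rec; }
                elseif ($u[1] < $key) { $lo = $mid + 1; }
                else                  { $hi = $mid - 1; }
            }
            fclose($fp);
            return null;  // not found
        }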

  • How do I store an image's pixels in matrix form?

    - by Rajendra Bhole
    Hi, I'm developing an application in which I want to store the pixelized image in matrix format. The code is as follows:

        struct pixel {
            //unsigned char r, g, b, a;
            Byte r, g, b;
            int count;
        };

        - (NSInteger) processImage1: (UIImage*) image {
            // Allocate a buffer big enough to hold all the pixels
            struct pixel* pixels = (struct pixel*) calloc(1, image.size.width * image.size.height * sizeof(struct pixel));
            if (pixels != nil) {
                // Create a new bitmap
                CGContextRef context = CGBitmapContextCreate(
                    (void*) pixels,
                    image.size.width,
                    image.size.height,
                    8,
                    image.size.width * 4,
                    CGImageGetColorSpace(image.CGImage),
                    kCGImageAlphaPremultipliedLast);
                NSLog(@"1=%d, 2=%d, 3=%d", CGImageGetBitsPerComponent(image), CGImageGetBitsPerPixel(image), CGImageGetBytesPerRow(image));
                if (context != NULL) {
                    // Draw the image in the bitmap
                    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, image.size.width, image.size.height), image.CGImage);
                    NSUInteger numberOfPixels = image.size.width * image.size.height;

    I'm confused about how to initialize the 2-D matrix in which to store the pixel data.
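
    For what it's worth, the buffer CGBitmapContextCreate draws into is already laid out row by row, so a flat array behaves as a 2-D matrix without further initialization - provided struct pixel really matches the context's pixel format (with kCGImageAlphaPremultipliedLast that is 4 bytes per pixel: r, g, b, a, which the struct above does not). A tiny sketch of the indexing, assuming a matching element layout:

        int width  = (int) image.size.width;
        int height = (int) image.size.height;
        // Row-major layout: the pixel at column x, row y of the "matrix".
        struct pixel p = pixels[y * width + x];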

  • in java, which is better - three arrays of booleans or 1 array of bytes?

    - by joe_shmoe
    I know the question sounds silly, but consider this: I have an array of items and a labelling algorithm. At any point an item is in one of three states. The current version holds these states in a byte array, where 0, 1 and 2 represent the three states. Alternatively, I could have three arrays of booleans - one for each state. Which is better (consumes less memory) depends on how the JVM (Sun's version) stores the arrays - is a boolean represented by 1 bit? (P.S. don't start with all that "this is not the way OO/Java works" - I know, but here performance comes first. Plus, the algorithm is simple and perfectly readable even in this form.) Thanks a lot

  • What platforms have something other than 8-bit char?

    - by Craig McQueen
    Every now and then, someone on SO points out that char (aka 'byte') isn't necessarily 8 bits. It seems that 8-bit char is almost universal. I would have thought that for mainstream platforms, it is necessary to have an 8-bit char to ensure its viability in the marketplace. Both now and historically, what platforms use a char that is not 8 bits, and why would they differ from the "normal" 8 bits? When writing code, and thinking about cross-platform support (e.g. for general-use libraries), what sort of consideration is it worth giving to platforms with non-8-bit char? In the past I've come across some Analog Devices DSPs for which char is 16 bits. DSPs are a bit of a niche architecture I suppose. (Then again, at the time hand-coded assembler easily beat what the available C compilers could do, so I didn't really get much experience with C on that platform.)

  • How do I write raw binary data in Python?

    - by Chris B.
    I've got a Python program that stores and writes data to a file. The data is raw binary data, stored internally as str. I'm writing it out through a utf-8 codec. However, I get UnicodeDecodeError: 'charmap' codec can't decode byte 0x8d in position 25: character maps to <undefined> in the cp1252.py file. This looks to me like Python is trying to interpret the data using the default code page. But it doesn't have a default code page; that's why I'm using str, not unicode. I guess my questions are: How do I represent raw binary data in memory, in Python? When I'm writing raw binary data out through a codec, how do I encode/decode it?
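
    Raw binary data should bypass text codecs entirely: in Python 2 a str is already a byte string, so opening the file in binary mode and writing it directly avoids the cp1252 round trip. A minimal sketch:

        # Python 2: str is bytes; 'wb' writes it untouched, no codec involved.
        data = '\x8d\x00\xff raw payload'
        with open('out.bin', 'wb') as f:
            f.write(data)

        # Codecs are only for text. If part of the data is text, keep it as
        # unicode internally and encode explicitly at the boundary:
        text = u'caf\xe9'
        with open('out.txt', 'wb') as f:
            f.write(text.encode('utf-8'))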

  • Map derived class as an independent one with FNH's Automap

    - by Anton Gogolev
    Hi! Basically, I have an ImageMetadata class and an Image class, which derives from ImageMetadata. Image adds one property: byte[] Content, which actually contains the binary data. What I want to do is map these two classes onto one table, but I absolutely do not need NHibernate's inheritance support to kick in. I want to tailor the FNH automapper to produce something like:

        <class name="ImageMetadata" ...>
          <property name="Name" ... />
          < ... />

        <class name="Image" ...>
          <property name="Name" ... />
          <property name="Content" ... />
          < ... />

    Is this at all possible?

  • Android Out of memory regarding png image

    - by turtleboy
    I have a JPG image in my app that shows correctly. In my ListView I'd like to make the image more transparent so it is easier to see the text, so I changed the image to PNG format and altered its opacity in GIMP. Now that the new image is in the app's drawable folder, I'm getting the following error. Why?

        09-28 09:24:07.560: I/global(20140): call socket shutdown, tmpsocket=Socket[address=/178.250.50.40,port=80,localPort=35172]
        09-28 09:24:07.570: I/global(20140): call socket shutdown, tmpsocket=Socket[address=/212.169.27.217,port=84,localPort=55656]
        09-28 09:24:07.690: D/dalvikvm(20140): GC_FOR_ALLOC freed 113K, 4% free 38592K/39907K, paused 32ms
        09-28 09:24:07.690: I/dalvikvm-heap(20140): Forcing collection of SoftReferences for 28072816-byte allocation
        09-28 09:24:07.740: D/dalvikvm(20140): GC_BEFORE_OOM freed 9K, 4% free 38582K/39907K, paused 43ms
        09-28 09:24:07.740: E/dalvikvm-heap(20140): Out of memory on a 28072816-byte allocation.
        09-28 09:24:07.740: I/dalvikvm(20140): "main" prio=5 tid=1 RUNNABLE
        09-28 09:24:07.740: I/dalvikvm(20140): | group="main" sCount=0 dsCount=0 obj=0x40a57490 self=0x1b6e9a8
        09-28 09:24:07.740: I/dalvikvm(20140): | sysTid=20140 nice=0 sched=0/0 cgrp=default handle=1074361640
        09-28 09:24:07.740: I/dalvikvm(20140): | schedstat=( 2289118000 760844000 2121 ) utm=195 stm=33 core=1
        09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:486)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.drawable.Drawable.createFromResourceStream(Drawable.java:773)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.content.res.Resources.loadDrawable(Resources.java:2042)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.content.res.TypedArray.getDrawable(TypedArray.java:601)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.View.<init>(View.java:2812)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.ViewGroup.<init>(ViewGroup.java:410)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.widget.LinearLayout.<init>(LinearLayout.java:174)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.widget.LinearLayout.<init>(LinearLayout.java:170)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Constructor.constructNative(Native Method)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Constructor.newInstance(Constructor.java:417)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.createView(LayoutInflater.java:586)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.policy.impl.PhoneLayoutInflater.onCreateView(PhoneLayoutInflater.java:56)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.onCreateView(LayoutInflater.java:653)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:678)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:466)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:396)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:352)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.policy.impl.PhoneWindow.setContentView(PhoneWindow.java:278)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Activity.setContentView(Activity.java:1897)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.carefreegroup.ShowMoreDetails.onCreate(ShowMoreDetails.java:26)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Activity.performCreate(Activity.java:4543)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1071)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2181)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2260)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.access$600(ActivityThread.java:139)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1277)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.os.Handler.dispatchMessage(Handler.java:99)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.os.Looper.loop(Looper.java:156)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.main(ActivityThread.java:5045)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Method.invokeNative(Native Method)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Method.invoke(Method.java:511)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
        09-28 09:24:07.740: I/dalvikvm(20140): at dalvik.system.NativeStart.main(Native Method)
        09-28 09:24:07.740: E/dalvikvm(20140): Out of memory: Heap Size=46115KB, Allocated=38582KB, Limit=65536KB
        09-28 09:24:07.740: E/dalvikvm(20140): Extra info: Footprint=39907KB, Allowed Footprint=46115KB, Trimmed=892KB
        09-28 09:24:07.740: E/Bitmap_JNI(20140): Create Bitmap Failed.
        09-28 09:24:07.740: A/libc(20140): Fatal signal 11 (SIGSEGV) at 0x00000004 (code=1)
        09-28 09:24:09.750: I/dalvikvm(20367): Turning on JNI app bug workarounds for target SDK version 10...
        09-28 09:24:09.940: D/dalvikvm(20367): GC_CONCURRENT freed 864K, 21% free 3797K/4771K, paused 2ms+2ms

    Thanks.

    [update]

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.showmoredetailslayout);
            actualCallTime = (TextView) findViewById(R.id.actualcalltime);
            doubleUp = (TextView) findViewById(R.id.doubleupcallid);
            needName = (TextView) findViewById(R.id.needname);
            needNameLabel = (TextView) findViewById(R.id.neednamelabel);
            getRotaDetails = (Button) findViewById(R.id.buttongetrotadetails);
            intent = this.getIntent();
            String actualTimeIn = intent.getStringExtra("actTimeIn");
            String actualTimeOut = intent.getStringExtra("actTimeOut");
            String doubleUpValue = intent.getStringExtra("doubleUpValue");
            String needNameWithCommas = intent.getStringExtra("needNameWithCommas");
            callID = intent.getStringExtra("callID");
            String[] needs = needNameWithCommas.split(",");
            actualCallTime.setText("This call was completed at " + actualTimeIn + " -" + actualTimeOut);
            if (!doubleUpValue.equalsIgnoreCase("") || doubleUpValue.equalsIgnoreCase("]")) {
                doubleUp.setText("This call was not a double up ");
            } else {
                doubleUp.setText("This call was a double up " + doubleUpValue);
            }
            needNameLabel.setText("Purpose of Call: ");
            for (int i = 0; i < needs.length; i++) {
                needName.append(needs[i] + "\n");
            }
            getRotaDetails.setOnClickListener(new OnClickListener() {
                @Override
                public void onClick(View v) {
                    Intent intent = new Intent(ShowMoreDetails.this, GetRotaDetails.class);
                    intent.putExtra("callIDExtra", callID);
                    startActivity(intent);
                }
            });
        }
        }
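
    The failed allocation is ~28 MB - the PNG's full decoded size (width x height x 4 bytes) - so the usual fix is to decode a subsampled bitmap rather than to change the file format. A sketch using BitmapFactory.Options; the resource name and target width are illustrative:

        BitmapFactory.Options opts = new BitmapFactory.Options();

        // First pass: read only the dimensions, allocating no pixels.
        opts.inJustDecodeBounds = true;
        BitmapFactory.decodeResource(getResources(), R.drawable.background, opts);

        // Subsampling factor: 2 quarters the memory, 4 takes 1/16, and so on.
        opts.inSampleSize = Math.max(1, opts.outWidth / 800);  // aim for ~800px wide
        opts.inJustDecodeBounds = false;

        Bitmap scaled = BitmapFactory.decodeResource(getResources(), R.drawable.background, opts);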

  • Screen capture code produces black bitmap

    - by wadetandy
    I need to add the ability to take a screenshot of the entire screen, not just the current window. The following code produces a .bmp file with the correct dimensions, but the image is completely black. What am I doing wrong?

        void CaptureScreen(LPCTSTR lpszFilePathName)
        {
            BITMAPFILEHEADER bmfHeader;
            BITMAPINFO *pbminfo;
            HBITMAP hBmp;
            FILE *oFile;
            HDC screen;
            HDC memDC;
            int sHeight;
            int sWidth;
            LPBYTE pBuff;
            BITMAP bmp;
            WORD cClrBits;
            RECT rcClient;

            screen = GetDC(0);
            memDC = CreateCompatibleDC(screen);
            sHeight = GetDeviceCaps(screen, VERTRES);
            sWidth = GetDeviceCaps(screen, HORZRES);
            //GetObject(screen, sizeof(BITMAP), &bmp);
            hBmp = CreateCompatibleBitmap(screen, sWidth, sHeight);

            // Retrieve the bitmap color format, width, and height.
            GetObject(hBmp, sizeof(BITMAP), (LPSTR)&bmp);

            // Convert the color format to a count of bits.
            cClrBits = (WORD)(bmp.bmPlanes * bmp.bmBitsPixel);
            if (cClrBits == 1) cClrBits = 1;
            else if (cClrBits <= 4) cClrBits = 4;
            else if (cClrBits <= 8) cClrBits = 8;
            else if (cClrBits <= 16) cClrBits = 16;
            else if (cClrBits <= 24) cClrBits = 24;
            else cClrBits = 32;

            // Allocate the BITMAPINFO (with a color table only below 24 bpp).
            if (cClrBits < 24)
                pbminfo = (BITMAPINFO *) LocalAlloc(LPTR, sizeof(BITMAPINFOHEADER) + sizeof(RGBQUAD) * (1 << cClrBits));
            else
                pbminfo = (BITMAPINFO *) LocalAlloc(LPTR, sizeof(BITMAPINFOHEADER));

            pbminfo->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
            pbminfo->bmiHeader.biWidth = bmp.bmWidth;
            pbminfo->bmiHeader.biHeight = bmp.bmHeight;
            pbminfo->bmiHeader.biPlanes = bmp.bmPlanes;
            pbminfo->bmiHeader.biBitCount = bmp.bmBitsPixel;
            if (cClrBits < 24)
                pbminfo->bmiHeader.biClrUsed = (1 << cClrBits);
            pbminfo->bmiHeader.biCompression = BI_RGB;
            // Compute the number of bytes in the array of color indices
            // and store the result in biSizeImage. The width must be
            // DWORD aligned unless the bitmap is RLE compressed.
            pbminfo->bmiHeader.biSizeImage = ((pbminfo->bmiHeader.biWidth * cClrBits + 31) & ~31) / 8 * pbminfo->bmiHeader.biHeight;
            // Set biClrImportant to 0, indicating that all of the
            // device colors are important.
            pbminfo->bmiHeader.biClrImportant = 0;

            CreateBMPFile(lpszFilePathName, pbminfo, hBmp, memDC);
        }

        void CreateBMPFile(LPTSTR pszFile, PBITMAPINFO pbi, HBITMAP hBMP, HDC hDC)
        {
            HANDLE hf;                 // file handle
            BITMAPFILEHEADER hdr;      // bitmap file-header
            PBITMAPINFOHEADER pbih;    // bitmap info-header
            LPBYTE lpBits;             // memory pointer
            DWORD dwTotal;             // total count of bytes
            DWORD cb;                  // incremental count of bytes
            BYTE *hp;                  // byte pointer
            DWORD dwTmp;
            int lines;

            pbih = (PBITMAPINFOHEADER) pbi;
            lpBits = (LPBYTE) GlobalAlloc(GMEM_FIXED, pbih->biSizeImage);

            // Retrieve the color table (RGBQUAD array) and the bits
            // (array of palette indices) from the DIB.
            lines = GetDIBits(hDC, hBMP, 0, (WORD) pbih->biHeight, lpBits, pbi, DIB_RGB_COLORS);

            // Create the .BMP file.
            hf = CreateFile(pszFile, GENERIC_READ | GENERIC_WRITE, (DWORD) 0, NULL,
                            CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, (HANDLE) NULL);
            hdr.bfType = 0x4d42;       // 0x42 = "B" 0x4d = "M"
            // Compute the size of the entire file.
            hdr.bfSize = (DWORD) (sizeof(BITMAPFILEHEADER) + pbih->biSize + pbih->biClrUsed * sizeof(RGBQUAD) + pbih->biSizeImage);
            hdr.bfReserved1 = 0;
            hdr.bfReserved2 = 0;
            // Compute the offset to the array of color indices.
            hdr.bfOffBits = (DWORD) sizeof(BITMAPFILEHEADER) + pbih->biSize + pbih->biClrUsed * sizeof(RGBQUAD);

            // Copy the BITMAPFILEHEADER into the .BMP file.
            WriteFile(hf, (LPVOID) &hdr, sizeof(BITMAPFILEHEADER), (LPDWORD) &dwTmp, NULL);
            // Copy the BITMAPINFOHEADER and RGBQUAD array into the file.
            WriteFile(hf, (LPVOID) pbih, sizeof(BITMAPINFOHEADER) + pbih->biClrUsed * sizeof(RGBQUAD), (LPDWORD) &dwTmp, NULL);
            // Copy the array of color indices into the .BMP file.
            dwTotal = cb = pbih->biSizeImage;
            hp = lpBits;
            WriteFile(hf, (LPSTR) hp, (int) cb, (LPDWORD) &dwTmp, NULL);

            // Close the .BMP file.
            CloseHandle(hf);
            // Free memory.
            GlobalFree((HGLOBAL) lpBits);
        }
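
    As for the black image: CaptureScreen creates a compatible bitmap but never selects it into memDC and never copies the screen into it, so GetDIBits later reads an untouched (black) bitmap. A sketch of the missing step, to run right after CreateCompatibleBitmap:

        // Select the bitmap into the memory DC, then blit the screen into it.
        HGDIOBJ oldObj = SelectObject(memDC, hBmp);
        BitBlt(memDC,                  // destination DC, now backed by hBmp
               0, 0, sWidth, sHeight,  // full screen
               screen,                 // source DC
               0, 0, SRCCOPY);
        SelectObject(memDC, oldObj);   // deselect before calling GetDIBits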

  • F# powerpack and distribution

    - by rwallace
    I need arbitrary-precision rational numbers, which I'm given to understand are available in the F# PowerPack. My question is about the mechanics of distribution: my program needs to be able to compile and run on at least Windows/.NET and Linux/Mono, since I have potential users on both platforms. As I understand it, the best procedure is: download the PowerPack .zip, not the installer; copy the DLL into my program directory; copy the accompanying license file into my program directory, to make sure everything is above board; declare references and go ahead and use the functions I need; and ship the above files along with my source and binary. Since the DLL is byte code, it should work on either platform. Is this the correct procedure? Am I missing anything?

    Read the article
