Search Results

Search found 4838 results on 194 pages for 'binary heap'.


  • How could I represent 1.625 by 0 or a 1 (binary digit)?

    - by pepito
    This is an excerpt from Wikipedia about the 'Full Rate' speech coding standard: "Full Rate (FR, GSM-FR or GSM 06.10) was the first digital speech coding standard used in the GSM digital mobile phone system. The bit rate of the codec is 13 kbit/s, or 1.625 bits/audio sample." And this is an excerpt from the Wikipedia article on the bit: "In computing parlance, a bit is the abbreviation for a single binary digit, represented by a 0 or a 1." How could I represent 1.625 by a 0 or a 1? That is actually my lecturer's question, which I could not answer. Links to papers are more than welcome. Thanks in advance.
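
    The fractional figure is an average over a coded frame rather than something stored per sample: the codec turns a whole block of samples into a whole number of bits, and 1.625 is simply that bit count divided by the sample count. A minimal sketch of the arithmetic, assuming the usual GSM 06.10 figures of 260 bits per 20 ms frame of 160 samples at 8 kHz (C++):

        #include <iostream>

        int main() {
            // Assumed GSM 06.10 (Full Rate) frame parameters.
            const int bits_per_frame    = 260;  // whole bits produced for each encoded frame
            const int samples_per_frame = 160;  // 20 ms of audio at 8000 samples/s
            const int frames_per_second = 50;   // 1000 ms / 20 ms

            // "Bits per sample" is a ratio of two integers, not a value written to the stream.
            double bits_per_sample = static_cast<double>(bits_per_frame) / samples_per_frame;
            int bit_rate = bits_per_frame * frames_per_second;                // bits per second

            std::cout << "average bits/sample: " << bits_per_sample << '\n';  // 1.625
            std::cout << "bit rate: " << bit_rate << " bit/s\n";              // 13000
            return 0;
        }

    So every individual bit in the stream is still a 0 or a 1; only the average number of bits spent per input sample is fractional.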

    Read the article

  • How to implement dynamic binary search for search and insert operations on n elements (C or C++)

    - by iecut
    The idea is to use multiple arrays, each of length 2^k, to store n elements, according to the binary representation of n. Each array is sorted, but different arrays are not ordered with respect to one another. In this data structure, SEARCH is carried out by a sequence of binary searches, one on each array. INSERT is carried out by a sequence of merges of arrays of the same length until an empty array is reached.

    More detail: suppose we have a vertical array, and to each of its nodes a horizontal array is attached: to the first node a horizontal array of length 2^0 = 1, to the second node a horizontal array of length 2^1 = 2, and so on. So the first insert goes into the first horizontal array; after the second insert the first array becomes empty and the second horizontal array holds 2 elements; after the third insert the 1st and 2nd horizontal arrays are both filled, and so on.

    I implemented an ordinary binary search for search and insert as follows:

        int main() {
            int a[20] = {0};
            int n, i, j, temp;
            int *beg, *end, *mid, target;

            printf(" enter the total integers you want to enter (make it less than 20):\n");
            scanf("%d", &n);
            if (n >= 20)   /* guard against overflowing a[20] */
                return 0;

            printf(" enter the integer array elements:\n");
            for (i = 0; i < n; i++) {
                scanf("%d", &a[i]);
            }

            /* sort the loaded array so binary search can be used */
            for (i = 0; i < n - 1; i++) {
                for (j = 0; j < n - i - 1; j++) {
                    if (a[j + 1] < a[j]) {
                        temp = a[j];
                        a[j] = a[j + 1];
                        a[j + 1] = temp;
                    }
                }
            }

            printf(" the sorted numbers are:");
            for (i = 0; i < n; i++) {
                printf("%d ", a[i]);
            }

            /* point to the beginning and one element past the end of the loaded array */
            beg = &a[0];
            end = &a[n];
            /* mid should point somewhere in the middle of these addresses */
            mid = beg += n / 2;

            printf("\n enter the number to be searched:");
            scanf("%d", &target);

            /* binary search; note the AND in the middle of while() */
            while ((beg <= end) && (*mid != target)) {
                /* is the target in the lower or upper half? */
                if (target < *mid) {
                    end = mid - 1;        /* new end */
                    n = n / 2;
                    mid = beg += n / 2;   /* new middle */
                } else {
                    beg = mid + 1;        /* new beginning */
                    n = n / 2;
                    mid = beg += n / 2;   /* new middle */
                }
            }

            /* did we find the target? */
            if (*mid == target) {
                printf("\n %d found!", target);
            } else {
                printf("\n %d not found!", target);
            }

            getchar();   /* trap enter */
            getchar();   /* wait */
            return 0;
        }

    Could anyone please suggest how to modify this program, or sketch a new one, to implement the dynamic binary search described above?
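
    A minimal sketch of the structure described above, offered as an illustration rather than a reference implementation (the class and member names are mine): keep one sorted array per set bit of n; SEARCH binary-searches each non-empty array, and INSERT merges equal-length arrays upward until an empty slot is found, exactly like adding 1 in binary with carries.

        #include <algorithm>
        #include <iostream>
        #include <iterator>
        #include <vector>

        // Slot k is either empty or holds exactly 2^k elements, kept sorted.
        struct DynamicBinarySearch {
            std::vector<std::vector<int>> slots;

            bool search(int x) const {
                for (const auto& s : slots)
                    if (!s.empty() && std::binary_search(s.begin(), s.end(), x))
                        return true;
                return false;
            }

            void insert(int x) {
                std::vector<int> carry{x};                    // a sorted run of length 2^0
                for (std::size_t k = 0; ; ++k) {
                    if (k == slots.size()) slots.emplace_back();
                    if (slots[k].empty()) {                   // empty slot: the carry stops here
                        slots[k] = std::move(carry);
                        return;
                    }
                    std::vector<int> merged;                  // merge two runs of length 2^k
                    std::merge(slots[k].begin(), slots[k].end(),
                               carry.begin(), carry.end(), std::back_inserter(merged));
                    slots[k].clear();
                    carry = std::move(merged);                // carry a run of length 2^(k+1)
                }
            }
        };

        int main() {
            DynamicBinarySearch dbs;
            for (int v : {5, 1, 9, 7, 3}) dbs.insert(v);
            std::cout << std::boolalpha << dbs.search(7) << ' ' << dbs.search(4) << '\n';  // true false
        }

    Search costs O(log^2 n) (one binary search on each of up to log n arrays); insert is O(n) in the worst case but O(log n) amortized, since each element takes part in at most log n merges.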

    Read the article

  • ubuntu 13.10 kvm binary is deprecated, please use qemu-system-x86_64

    - by ??1986
    I just upgraded from 13.04 to 13.10, and I get this error when I run my KVM guest:

        Unable to complete install: 'internal error: process exited while connecting to monitor:
        W: kvm binary is deprecated, please use qemu-system-x86_64 instead
        char device redirected to /dev/pts/10 (label charserial0)
        failed to initialize KVM: Device or resource busy

    Detailed error:

        Traceback (most recent call last):
          File "/usr/share/virt-manager/virtManager/asyncjob.py", line 96, in cb_wrapper
            callback(asyncjob, *args, **kwargs)
          File "/usr/share/virt-manager/virtManager/create.py", line 1983, in do_install
            guest.start_install(False, meter=meter)
          File "/usr/lib/python2.7/dist-packages/virtinst/Guest.py", line 1246, in start_install
            noboot)
          File "/usr/lib/python2.7/dist-packages/virtinst/Guest.py", line 1314, in _create_guest
            dom = self.conn.createLinux(start_xml or final_xml, 0)
          File "/usr/lib/python2.7/dist-packages/libvirt.py", line 2892, in createLinux
            if ret is None: raise libvirtError('virDomainCreateLinux() failed', conn=self)
        libvirtError: internal error: process exited while connecting to monitor:
        W: kvm binary is deprecated, please use qemu-system-x86_64 instead
        char device redirected to /dev/pts/8 (label charserial0)
        failed to initialize KVM: Device or resource busy

    Read the article

  • Is it safe to delete rotated MySQL binary logs?

    - by Milan Babuškov
    I have a MySQL server with binary logging active. Once a day the log file is "rotated", i.e. MySQL seems to stop writing to it and creates a new log file. For example, I currently have these files in /var/lib/mysql:

        -rw-rw---- 1 mysql mysql 10485760 Jun 7 09:26 ibdata1
        -rw-rw---- 1 mysql mysql  5242880 Jun 7 09:26 ib_logfile0
        -rw-rw---- 1 mysql mysql  5242880 Jun 2 15:20 ib_logfile1
        -rw-rw---- 1 mysql mysql  1916844 Jun 6 09:20 mybinlog.000004
        -rw-rw---- 1 mysql mysql 61112500 Jun 7 09:26 mybinlog.000005
        -rw-rw---- 1 mysql mysql 15609789 Jun 7 13:57 mybinlog.000006
        -rw-rw---- 1 mysql mysql       54 Jun 7 09:26 mybinlog.index

    and mybinlog.000006 is still growing. Can I simply take mybinlog.000004 and mybinlog.000005, zip them up and transfer them to another server, or do I need to do something else first? What information is stored in mybinlog.index? Only the name of the latest binary log?

    UPDATE: I understand I can delete the logs with PURGE BINARY LOGS, which also updates the mybinlog.index file. However, I need to transfer the logs to another computer before deleting them (I test whether the backup is valid on another machine). To reduce the transfer size, I want to bzip2 the files. What will PURGE BINARY LOGS do if the log files are no longer there?

    Read the article

  • How can I find the common ancestor of two nodes in a binary tree?

    - by Siddhant
    The binary tree here is not a binary search tree, just a plain binary tree. The structure can be taken as:

        struct node {
            int data;
            struct node *left;
            struct node *right;
        };

    The best solution I could work out with a friend was something of this sort. Consider this binary tree (from http://lcm.csa.iisc.ernet.in/dsa/node87.html): the inorder traversal yields 8, 4, 9, 2, 5, 1, 6, 3, 7 and the postorder traversal yields 8, 9, 4, 5, 2, 6, 7, 3, 1. So, for instance, if we want to find the common ancestor of nodes 8 and 5, we make a list of all the nodes that lie between 8 and 5 in the inorder traversal, which in this case happens to be [4, 9, 2]. Then we check which node in this list appears last in the postorder traversal, which is 2. Hence the common ancestor of 8 and 5 is 2. The complexity of this algorithm, I believe, is O(n): O(n) for the inorder/postorder traversals, and the remaining steps are again O(n) since they are nothing more than simple iterations over arrays. But this is a very crude approach, and I'm not sure whether it breaks down for some case (there is a strong chance it does). Is there any other (possibly more optimal) solution to this problem?
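
    One standard alternative, sketched below under the assumption that both target values are actually present in the tree: a single recursive pass that returns a node as soon as it sees either target, and declares a node the lowest common ancestor when the two sides of the recursion each found one. This is O(n) time with no auxiliary arrays; the make() helper and the hard-coded tree are only there to make the demo self-contained.

        #include <cstdio>
        #include <cstdlib>

        struct node {
            int data;
            struct node *left;
            struct node *right;
        };

        static struct node *make(int data, struct node *left, struct node *right) {
            struct node *n = (struct node *)malloc(sizeof *n);
            n->data = data; n->left = left; n->right = right;
            return n;
        }

        /* Lowest common ancestor of the nodes holding a and b, assuming both exist in the tree. */
        static struct node *lca(struct node *root, int a, int b) {
            if (root == NULL || root->data == a || root->data == b)
                return root;                    /* found a target (or fell off the tree) */
            struct node *l = lca(root->left, a, b);
            struct node *r = lca(root->right, a, b);
            if (l != NULL && r != NULL)
                return root;                    /* one target in each subtree: this is the LCA */
            return (l != NULL) ? l : r;         /* otherwise pass up whichever side found one */
        }

        int main(void) {
            /* The tree from the question: inorder 8 4 9 2 5 1 6 3 7, postorder 8 9 4 5 2 6 7 3 1. */
            struct node *root =
                make(1, make(2, make(4, make(8, NULL, NULL), make(9, NULL, NULL)),
                                make(5, NULL, NULL)),
                        make(3, make(6, NULL, NULL), make(7, NULL, NULL)));
            printf("lca(8, 5) = %d\n", lca(root, 8, 5)->data);  /* prints 2 */
            return 0;
        }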

    Read the article

  • UBJsonReader (Libgdx) unable to read UBJson from Python (Blender)

    - by daniel
    I am working on an export tool from Blender to Libgdx that exports custom attributes and other information (it is almost complete). It is a very useful tool that should speed up your work a lot, and once it is finished I will post it on the contribute forum. The export format uses Python's standard json module and is readable text, and that of course works fine, but I would also like a binary JSON export for faster loading, so users can export straight to Libgdx. From what I found, UBJSON encoded with draft9.py (simpleubjson 0.6.1) seems to match what FBXConverter's UBJsonWriter (written by Xoppa) produces, but when I export I am not able to read the file back, and I get the error below (Java heap space). It seems there is a difference in byte sizes between the Python UBJSON and UBJsonReader. How can I write, in Python, a correct UBJSON file that matches Libgdx's UBJsonReader and stays cross-platform?

        Exception in thread "LWJGL Application" com.badlogic.gdx.utils.GdxRuntimeException: java.lang.OutOfMemoryError: Java heap space
            at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:120)
        Caused by: java.lang.OutOfMemoryError: Java heap space
            at com.badlogic.gdx.utils.UBJsonReader.readString(UBJsonReader.java:162)
            at com.badlogic.gdx.utils.UBJsonReader.parseString(UBJsonReader.java:150)
            at com.badlogic.gdx.utils.UBJsonReader.parseObject(UBJsonReader.java:112)
            at com.badlogic.gdx.utils.UBJsonReader.parse(UBJsonReader.java:59)
            at com.badlogic.gdx.utils.UBJsonReader.parse(UBJsonReader.java:52)
            at com.badlogic.gdx.utils.UBJsonReader.parse(UBJsonReader.java:36)
            at com.badlogic.gdx.utils.UBJsonReader.parse(UBJsonReader.java:45)
            at com.me.gdximportexport.GdxImportExport.create(GdxImportExport.java:43)
            at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:136)
            at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:114)

    Tested on Ubuntu Studio 13.10 with OpenJDK 7, and on Windows 7 with JDK 7. Thanks for any guidance.

    Read the article

  • Java: confirm method Binary division and find remainder is correct?

    - by cadwag
    I am parsing binary files and have to implement a CRC algorithm to ensure a file is not corrupted. The problem is that I can't seem to get the binary math working when using larger numbers. The example I'm trying to get working:

        BigInteger G = new BigInteger("11001", 2);
        BigInteger M = new BigInteger("1110010000", 2);
        BigInteger R = M.remainder(G);

    I am expecting R = "0101", but I am getting R = "1100". I assume the remainder of 0101 is correct, since it is given in the book I am using as a reference for the CRC algorithm (which is not Java-based), but I can't seem to reproduce it. I can get small binary calculations to work that I have solved by hand, but not the larger ones. I'll admit that I haven't worked the larger ones by hand yet; that is my next step. But I wanted to see if someone could point out a glaring flaw in my code. Can anyone confirm or deny that my methodology is correct? Thanks.
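
    The two answers differ because BigInteger.remainder performs ordinary integer division, while a CRC uses mod-2 (carry-less) polynomial division in which subtraction is XOR and there are no borrows: 1110010000 is 912 and 11001 is 25, and 912 mod 25 = 12 = 1100, which is exactly what BigInteger returns, whereas the book's 0101 is the XOR-division remainder. A minimal sketch of that XOR long division, written in C++ to match the other sketches on this page (the same loop translates directly to Java, or to BigInteger via testBit/xor/shiftLeft):

        #include <cstdint>
        #include <cstdio>

        // Remainder of mod-2 (carry-less) polynomial division, as used by CRCs:
        // repeatedly "subtract" the divisor with XOR wherever the leading bit is set.
        static uint32_t mod2_remainder(uint32_t message, uint32_t divisor) {
            int mtop = 31; while (mtop > 0 && !((message >> mtop) & 1u)) --mtop;
            int dtop = 31; while (dtop > 0 && !((divisor  >> dtop) & 1u)) --dtop;

            for (int shift = mtop - dtop; shift >= 0; --shift)
                if ((message >> (shift + dtop)) & 1u)   // leading bit at this position still set?
                    message ^= divisor << shift;        // cancel it with an XOR "subtraction"
            return message;                             // remainder has fewer bits than the divisor
        }

        int main() {
            uint32_t M = 0x390;  // 1110010000 in binary (912 decimal)
            uint32_t G = 0x19;   // 11001 in binary (25 decimal)
            printf("integer remainder: %u\n", M % G);                 // 12 -> 1100 (what BigInteger gives)
            printf("mod-2 remainder:   %u\n", mod2_remainder(M, G));  // 5  -> 0101 (what the CRC book expects)
            return 0;
        }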

    Read the article

  • what is the format of a binary image & how is it different from jpg, png images?

    - by Rahulsingh190
    I searched the internet for the basic formats of image files (e.g. .jpg, .png, .gif), the way there are well-defined formats for .doc, .pdf, etc., but I didn't find anything relevant. Today I also came across a .bin image format; BIN signifies that the image is in binary format. So, what is the internal format of a .jpg image file, and how is it different from the .bin (binary) format, given that everything is ultimately saved in binary form? And how is a BITMAP image different from the .jpg format?

    Read the article

  • C++: how to store an integer in a binary file?

    - by blaxc
    I've got a struct with two integers. I want to store them in a binary file and read them back again. Here is my code:

        struct pw {
            int a;
            int b;
        };

        void main() {
            pw* p = new pw();
            pw* q = new pw();
            std::ofstream fout(ADMIN_FILE, ios_base::out | ios_base::binary | ios_base::trunc);
            std::ifstream fin(ADMIN_FILE, ios_base::in | ios_base::binary);
            p->a = 123;
            p->b = 321;
            fout.write((const char*)p, sizeof(pw));
            fin.write((char*)q, sizeof(pw));
            fin.close();
            cout << q->a << endl;
        }

    My output is 0. Can anyone tell me what the problem is?
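
    For comparison, a minimal sketch of the same round trip that does work for a plain struct of two ints: it reads with istream::read rather than write, closes (flushes) the writer before opening the reader, and checks the stream state. "admin.bin" is just a placeholder for ADMIN_FILE, and this raw-byte approach only round-trips safely on the same platform and compiler (padding and endianness are not portable):

        #include <fstream>
        #include <iostream>

        struct pw {
            int a;
            int b;
        };

        int main() {
            const char* path = "admin.bin";  // placeholder for ADMIN_FILE

            pw out_rec{123, 321};
            {
                // Write the raw bytes of the struct; leaving the block closes (flushes) fout.
                std::ofstream fout(path, std::ios::binary | std::ios::trunc);
                fout.write(reinterpret_cast<const char*>(&out_rec), sizeof out_rec);
            }

            pw in_rec{};
            std::ifstream fin(path, std::ios::binary);
            if (!fin.read(reinterpret_cast<char*>(&in_rec), sizeof in_rec)) {  // read, not write
                std::cerr << "failed to read record\n";
                return 1;
            }
            std::cout << in_rec.a << " " << in_rec.b << "\n";  // 123 321
            return 0;
        }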

    Read the article

  • Stack & heap understanding question

    - by Petr
    Hi, I would really appreciate it if someone could tell me whether I understand this correctly:

        class X
        {
            A a1 = new A();  // reference on the stack, object value on the heap
            a1.VarA = 5;     // on the stack - value type
            A a2 = new A();  // reference on the stack, object value on the heap
            a2.VarA = 10;    // on the stack - value type
            a1 = a2;         // on the stack, the target of the a1 reference is updated
                             // to a2's value on the heap; also, both a1 and a2 references
                             // are on the stack, while their "object" values are on the heap
        }

        class A
        {
            int VarA;
        }

    But what about the VarA variable? Is it still a pure value type?

    Read the article

  • Java heap keeps on shrinking! What is happening in this graph of heap size?

    - by chillitom
    This is a screenshot of a JVM (win64, 6u17) running ActiveMQ; after every garbage collection the heap size shrinks. As the heap size shrinks, garbage collection gets more frequent, and the heap shrinks more quickly still. Eventually the VM locks up because it is spending all its time in GC. -Xms is the default and -Xmx is 2048 MB. What is happening? How can I avoid this? http://imagebin.org/92614 (N.B. originally posted on serverfault.com, moved to stackoverflow.com as requested.)

    Read the article

  • Where to check heap memory size and usage statistics, etc., in Windows?

    - by AKN
    Let's say I open some application or process, do some work with it, and then close it. I need to know whether the application caused any memory leak, i.e. used up some heap memory and didn't release it properly. Can I get these statistics somehow? I'm using Visual Studio (for development) under Windows. I would also be interested in getting this information for any third-party application.
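
    For code you build yourself, one option (an MSVC-specific sketch, not the only approach) is the CRT debug heap, which in a Debug build can dump every allocation that is still outstanding when the process exits; for third-party applications you are limited to external counters such as Task Manager or Performance Monitor, which show overall heap and private-bytes usage rather than individual leaked allocations.

        // MSVC Debug build only; the leak report appears in the debugger Output window at exit.
        #define _CRTDBG_MAP_ALLOC      // annotate CRT allocations with file/line where possible
        #include <cstdlib>
        #include <crtdbg.h>

        int main() {
            // Ask the debug heap to check for leaks automatically at process exit.
            _CrtSetDbgFlag(_CrtSetDbgFlag(_CRTDBG_REPORT_FLAG) | _CRTDBG_LEAK_CHECK_DF);

            int* leaked = static_cast<int*>(malloc(10 * sizeof(int)));  // deliberately never freed
            (void)leaked;

            // Or dump the currently outstanding allocations on demand:
            // _CrtDumpMemoryLeaks();
            return 0;
        }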

    Read the article

  • How to generate and encode (for use in GA), random, strict, binary rooted trees with N leaves?

    - by Peter Simon
    First, I am an engineer, not a computer scientist, so I apologize in advance for any misuse of nomenclature and general ignorance of CS background. Here is the motivational background for my question: I am contemplating writing a genetic algorithm optimizer to aid in designing a power divider network (also called a beam forming network, or BFN for short). The BFN is intended to distribute power to each of N radiating elements in an array of antennas. The fraction of the total input power to be delivered to each radiating element has been specified.

    Topologically speaking, a BFN is a strictly binary, rooted tree. Each of the (N-1) interior nodes of the tree represents the input port of an unequal, binary power splitter. The N leaves of the tree are the power divider outputs. Given a particular power divider topology, one is still free to map the power divider outputs to the array inputs in an arbitrary order. There are N! such permutations of the outputs. There are several considerations in choosing the desired ordering: 1) the power ratio for each binary coupler should be within a specified range of values; 2) the ordering should be chosen to simplify the mechanical routing of the transmission lines connecting the power divider.

    The number of outputs N of the BFN may range from, say, 6 to 22. I have already written a genetic algorithm optimizer that, given a particular BFN topology and desired array input power distribution, will search through the N! permutations of the BFN outputs to generate a design with compliant power ratios and good mechanical routing. I would now like to generalize my program to automatically generate and search through the space of possible BFN topologies. As I understand it, for N outputs (leaves of the binary tree), there are $C_{N-1}$ different topologies that can be constructed, where $C_N$ is the Nth Catalan number.

    I would like to know how to encode an arbitrary tree having N leaves in a way that is consistent with a chromosomal description for use in a genetic algorithm. Also associated with this is the need to generate random instances for filling the initial population, and to implement crossover and mutation operators for this type of chromosome. Any suggestions will be welcome. Please minimize the amount of CS lingo in your reply, since I am not likely to be acquainted with it. Thanks in advance, Peter
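
    One possible chromosome, offered as a sketch rather than a prescription: visit the internal nodes in preorder and record, for each one, how many of its leaves go to the left child. A strict binary tree with N leaves has N-1 internal nodes, so the chromosome has N-1 genes, it decodes to exactly one topology, random individuals are cheap to generate, and mutation only has to keep each gene between 1 and (leaves in that subtree) - 1. Note that the uniform split used below does not sample all $C_{N-1}$ topologies uniformly; weighting the split choice by Catalan numbers would make it uniform.

        #include <cstdio>
        #include <random>
        #include <vector>

        // Append, in preorder, one gene per internal node: the number of leaves
        // handed to that node's left subtree. Decoding just replays the same recursion.
        void random_splits(int n_leaves, std::mt19937& rng, std::vector<int>& genes) {
            if (n_leaves == 1) return;                        // a lone leaf adds no internal node
            int k = std::uniform_int_distribution<int>(1, n_leaves - 1)(rng);
            genes.push_back(k);                               // this node sends k leaves left
            random_splits(k, rng, genes);                     // left subtree (preorder)
            random_splits(n_leaves - k, rng, genes);          // right subtree
        }

        int main() {
            std::mt19937 rng(std::random_device{}());
            std::vector<int> genes;
            random_splits(8, rng, genes);                     // a random strict tree with 8 leaves
            for (int g : genes) std::printf("%d ", g);        // 7 genes, e.g. "4 2 1 1 1 2 1"
            std::printf("\n");
            return 0;
        }

    One simple crossover is to pick a subtree in each parent with the same number of leaves and exchange their gene segments (a subtree with m leaves is a contiguous block of m-1 genes in preorder), which keeps every offspring a valid strict binary tree.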

    Read the article

  • How to create a complete binary tree of height 'h' using Python?

    - by Jack
    Here is the node structure:

        class Node:
            def __init__(self, data):
                # initializes the data members
                self.left = None
                self.right = None
                self.parent = None
                self.data = data

    Complete binary tree definition: a binary tree in which every level, except possibly the deepest, is completely filled. At depth n, the height of the tree, all nodes must be as far left as possible. (http://www.itl.nist.gov/div897/sqg/dads/HTML/completeBinaryTree.html)

    I am looking for an efficient algorithm.

    Read the article

  • Can I use a binary literal in C or C++?

    - by hamza
    I need to work with a binary number. I tried writing const x = 00010000; but it didn't work. I know that I can use a hexadecimal number that has the same value as 00010000, but I want to know whether there is a type in C++ for binary numbers and, if there isn't, whether there is another solution to my problem.
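
    Two notes, offered as an aside: a literal written with a leading 0, such as 00010000, is parsed as octal in C and C++ (so it compiles but means 4096, not "binary 16"), and C++14 added true binary literals with the 0b prefix, which GCC and Clang had long accepted as an extension. Before C++14 the usual workarounds are hex, shifts, or std::bitset:

        #include <bitset>
        #include <iostream>

        int main() {
            const int a = 0b00010000;                                        // C++14 binary literal: 16
            const int b = 0x10;                                              // same value in hex
            const int c = 1 << 4;                                            // same value as a shift
            const unsigned long d = std::bitset<8>("00010000").to_ulong();   // parse a bit string: 16
            const int oct = 00010000;                                        // careful: octal, i.e. 4096

            std::cout << a << ' ' << b << ' ' << c << ' ' << d << ' ' << oct << '\n';  // 16 16 16 16 4096
            return 0;
        }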

    Read the article

  • Increased CF JVM max heap size and now CF service will not start

    - by Erik Vold
    So I went into ColdFusion Administrator and increased my CF JVM max heap size. I was told I would need to restart the service, so I stopped it, and when I tried to start it again I got the following error message:

        Windows could not start the ColdFusion 8 Application Server on Local Computer. For more
        information, review the System Event Log. If this is a non-Microsoft service, contact the
        service vendor, and refer to service-specific error code 2.

    So I went to the Event Viewer application, took a look at the Application log, and saw an error that said:

        The ColdFusion 8 Application Server service could not be started. Check the server
        "coldfusion" log files for more information.

    So I went to my ColdFusion logs directory and opened the server.log file, but I don't see anything useful in there or in any of the other log files. Any idea how I can change the JVM heap size back to what it was so that I can start CF again?

    Read the article

  • Deserialization error using Runtime Serialization with the Binary Formatter

    - by Lily
    When I deserialize a hierarchy I get the following error:

        The input stream is not a valid binary format. The starting contents (in bytes) are:
        20-01-20-20-20-FF-FF-FF-FF-01-20-20-20-20-20-20-20 ...

    Any help please? Extra info:

        public void Serialize(ISyntacticNode person)
        {
            Stream stream = File.Open(fileName, FileMode.OpenOrCreate);
            try
            {
                BinaryFormatter binary = new BinaryFormatter();
                pList.Add(person);
                binary.Serialize(stream, pList);
                stream.Close();
            }
            catch
            {
                stream.Close();
            }
        }

        public List<ISyntacticNode> Deserialize()
        {
            Stream stream = File.Open(fileName, FileMode.OpenOrCreate);
            BinaryFormatter binary = new BinaryFormatter();
            try
            {
                pList = (List<ISyntacticNode>)binary.Deserialize(stream);
                binary.Serialize(stream, pList);
                stream.Close();
            }
            catch
            {
                pList = new List<ISyntacticNode>();
                binary.Serialize(stream, pList);
                stream.Close();
            }
            return pList;
        }

    I am serializing a hierarchy of type Proxem.Antelope.Parsing.ISyntacticNode. Using a different instance, I have also gotten this error:

        System.Runtime.Serialization.SerializationException: Binary stream '116' does not contain
        a valid BinaryHeader. Possible causes are invalid stream or object version change between
        serialization and deserialization.

    How can I avoid this error?

    Read the article
