Search Results

Search found 395 results on 16 pages for 'truth'.

Page 9 of 16

  • Real RAM latency

    - by user32569
    Hi, very quick question. When I look up RAM timings, I find two different explanations of what CAS latency is. The first states it's the time between the CPU issuing a read command and the data being placed on the data bus. The second says it's the time from when a column in the memory layout has been activated. So where is the truth? I mean, when I want to know the total RAM latency: in case 1, it would be just CAS times the clock period. In case 2, it would be CAS plus other things like RAS-to-CAS, all times the clock period. Thanks.
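
    The two readings describe different situations, and a back-of-envelope calculation shows how they combine. A minimal Python sketch; the timing figures (DDR3-1600 with CAS 9 and RAS-to-CAS 9) are assumed example values, not a claim about any particular module:

        # Back-of-envelope sketch; all timing figures are assumed examples.
        clock_mhz = 800.0              # I/O clock of a DDR3-1600 module
        clock_ns = 1000.0 / clock_mhz  # one clock period in nanoseconds

        cas = 9     # CAS latency, in clocks
        trcd = 9    # RAS-to-CAS delay, in clocks

        # Row already open: only the CAS latency applies.
        print(cas * clock_ns)            # 11.25 ns

        # Row not open yet: the RAS-to-CAS delay comes first, then CAS.
        print((trcd + cas) * clock_ns)   # 22.50 ns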

    Read the article

  • Ubuntu 11.04 64-bit Keeps Randomly Freezing

    - by user971602
    This has been a real headache for me, since the number of freezes has increased from twice a week to once or twice a day. The system just halts and nothing can be done but hitting the restart button. At the beginning I thought it was related to Flash, since I was getting random freezes when playing full-screen Flash videos online. I also thought it could be the wireless PCI card. But the system has also frozen while simply browsing around GNOME. The truth is, the freezes are really random and strange. I checked the thread "Ubuntu keeps randomly freezing" and tried to ssh into my computer from another one, but I could not, since it was totally frozen; NumLock and CapsLock weren't responding or blinking. Since I could not ssh in, I also had to set aside this article: https://wiki.ubuntu.com/X/Troubleshooting/Freeze. According to my wife, the system has also halted under Win7 Pro 64-bit, but less frequently. Here is my system configuration: Intel Core i7 2600K, ASRock Z68 Extreme3 Gen3 motherboard, Crucial M4 128GB CT128M4SSD2 SSD, WD Caviar Green WD10EADS 1TB SATA II, G.SKILL Ripjaws X Series 8GB (2 x 4GB) DDR3, OCZ ModXStream 600W power supply, Rosewill RNX-N300X PCI wireless adapter, no external graphics card. I removed the wireless card and used Ethernet to see if that was the problem, but I got a freeze after doing that too. I also ran memtest86 and everything was OK. The only other thing I might suspect is the SSD; I will try to clone it to an HDD to see if that solves the problem. At this point I am stuck with the freezes. Does anyone have a clue why this is happening and how I can solve it?

    Read the article

  • Isn't Java quite a good choice for desktop applications?

    - by tactoth
    At present most desktop applications are still developed in C++, painfully: lack of portability, incompatible libraries, memory leaks, slow compilation and poor productivity. Even if you pick only a single one of these shortcomings, it's still a big headache. Yet the surprising truth is that C++ remains the first choice for desktop applications. Compared to C++, Java has lots of advantages. Its success in server-side development shows that the language itself is good, and Swing is thought to be as programmer-friendly as the highly regarded Qt framework (no, never say even a single word about MFC!). All the disadvantages of C++ listed above have a solution in Java. "Performance!" Well, that might still be the problem, but in my experience it's a slight one. I used Java to decode some screen video and generate key frames. The video had a duration of more than an hour, and the time spent on an average machine was just one minute; with C++ I wouldn't expect even faster speed. In recent days there has been a lot of news about JIT performance improvements, which makes it feel as if Java is gradually becoming very suitable for desktop development without people realizing it. Isn't it?

    Read the article

  • For photography use, is Unity overheating my laptop? Should I try openSUSE instead?

    - by SoT
    I am a perfect noob here in the Linux world; previously I was using Windows 7. Mine is an HP laptop - Intel Core 2 Duo T5470 @ 1.60GHz × 2 / 965GM with 2GB RAM. I installed Ubuntu 12.04 LTS and quite like its display; I recognized it as 3D before even knowing it was the Unity 3D interface. My uses are image editing, home use, downloads, browsing, etc. - no video editing or gaming at all. Being a photography enthusiast, I use image editing programs fairly heavily. But I now feel my laptop is getting a bit overheated - processor and hard disk. I tried lm-sensors and could not make much of it, then installed Xsensors, which gives the same output as lm-sensors: temperatures for four sensors (temp1 through temp4) under "acpitz". Please guide me on this. I also wanted to ask something more: which one is better for working with images - photography, I mean - openSUSE 12.1 or Ubuntu with Unity 3D? Can I get the same display quality with the openSUSE distribution? I heard that on laptops openSUSE uses power more efficiently; is there any truth to that? Please suggest whether I should try openSUSE or not, and if so, with which GUI: KDE or GNOME? Thanks in advance. Regards, SoT

    Read the article

  • Transformation of Product Management in Telecommunications for Rapid Launch of Next Generation Products

    - by raul.goycoolea
    The telecom industry continues to evolve through disruptive products, uncertain markets, shorter product lifecycles and the convergence of technologies. Today's market has moved from network-centric to consumer-centric and focuses primarily on the customer experience. This has created several product management challenges: increased complexity and volume of offerings, creation of product variants, pressure to accelerate time-to-market, the need to provide multiple product views for varied stakeholders, leveraging OSS intelligence in the BSS layer, product co-creation, and growing audit and security concerns for service providers. This document discusses how enterprise product management, enabled by PLM-based product catalogue solutions, helps launch next generation products rapidly in the telecommunications industry.

    1.0 Introduction

    Figure 1: Business Scenario

    Modern business demands the launch of complex products in a very short timeframe, with changes to price plans made faster and without IT intervention. One of the key transformation initiatives companies are focusing on is product management transformation and operational efficiency improvement. As part of these initiatives, companies are investing in best-in-class COTS-based product management solutions built on industry-wide standards.

    The new COTS packages are planned to integrate with existing or new B/OSS systems to provide a strategic, end-to-end, agile solution for reduced time-to-market and order journey time. In addition, system rationalization is being undertaken to phase out legacy systems and migrate to strategic systems.

    2.0 An Overview of Product Management in Telecom

    Product data in telecom is multi-dimensional and difficult to manage. It has grown significantly due to the complexity of products, product offerings on the converged network, the increased volume of offerings, bundled offering structures and ever increasing regulatory requirements.

    In addition, the shrinking product lifecycle in telecom makes it difficult to manage dynamic product data. Mergers and acquisitions, coupled with organic growth, pose major challenges to product portfolio management and are a roadblock on the journey towards becoming an agile organization.
    Figure 2: Complexity in Product Management

    "Network technology" is the new dimension in telecom product management: the same products are realized through different networks (from siloed networks to a converged network), and consequently the product solution differs.

    Figure 3: Current Scenario - Pain Points in Product Management

    The major business implications of the current scenario are slow time-to-market and an inefficient process that stifles innovation.

    3.0 Transformation of Next Generation Product Management

    Companies must focus their product management transformation journey on:

    • Management of a single truth of product information across the organization and geographies, which is currently spread over heterogeneous systems
    • Management of the intellectual property (IP) in the product concept, and partnership in the design of discrete components to integrate into the system
    • Leveraging structured and unstructured product data within the extended enterprise to extract consumer insights and drive innovation
    • Management of effective operational separation to comply with regulatory bodies
    • Reuse of existing designs, with relevant features such as value-added services added to enable effective product bundling

    Figure 4: Next Generation Needs

    PLM-based Enterprise Product Catalogue solutions efficiently address these requirements and act as an enabler of product management transformation and rapid product launch.

    4.0 PLM-based Enterprise Product Management

    Figure 5: PLM-based Enterprise Product Mastering

    Enterprise Product Management (EPM) enables the business to manage complex product data in complex environments. Product mastering helps create a "single view" of the product: a business-driven, IT-supported environment in which a global "single truth record" is created, managed and reused.

    4.1 The Business Case for Telco PLM-based Solutions for Enterprise Product Management

    • Telco PLM-based product mastering solutions provide a centralized authoring environment for product definition and control of all product data and rules
    • PLM packages are designed to support multiple perspectives of product data (ordering, billing and provisioning perspectives)
    • They maintain the relationships and links between the different elements of the entire product definition
    • Telco PLM packages specialize in next generation lifecycle management requirements such as revision and state management, test and release management, role management and impact analysis
    • They take into consideration all aspects of OSS product requirements, unlike CRM product catalogue solutions, where the product data managed is mostly order-oriented and transactional
    • The new breed of Telco PLM packages is designed around open standards such as SID and eTOM; they are interoperable and support integration frameworks such as subscription and notification
    • Telco PLM packages offer good collaboration frameworks for integrating suppliers and partners into the product development value chain

    4.2 Architectures and Approaches for Product Mastering Using Telco PLM Systems

    4.2.a Single Central Product Management (Mastering) Approach

    Figure 6: Single Central Product Management (Master) Approach

    This approach is implemented across verticals such as aerospace and automotive.
    It is built around a physically centralized product master on which the other sources depend. The product definition data (product bundles, service bundles, price plans, offers and discounts, product configuration rules and market campaigns) is created and maintained in a physically centralized environment, and the product definition/authoring environment is centralized as well. Existing legacy product definition data in the CRM product catalogue, the billing catalogue and the legacy product catalogue is migrated to the centralized PLM-based Enterprise Product Management solution.

    Architectural changes must be made in the existing application landscape for creating and revising data, because the applications have to refer to the central repository for approval and validation of product configurations. This is achieved by modifying how the applications write data, or by adapting them to use the centrally managed and published rules.

    Complete product configuration validation is done in the enterprise/central product catalogue, and the final configuration is sent to the B/OSS systems through an SOA-compliant product distribution architecture. This approach enables greater control over product data management and product data governance.

    4.2.b Federated Product Management (Mastering) Architecture

    Figure 7: Federated Product Management (Mastering) Architecture

    In the federated product mastering approach, the basic unique product definition data (product ID, description, product hierarchy, basic price plans and simple product design rules) is created and maintained centrally, while the advanced product definition (product bundling, promotions, offers and discount plans) is created in the respective downstream OSS systems.

    For example, basic product definitions such as attributes, product hierarchy and basic price plans are created and maintained in the enterprise/central product reference catalogue and distributed to the downstream OSS systems. The respective downstream OSS systems then build product bundles, promotions and advanced price plans on top of the basic product definition and master that advanced definition. The central reference database accesses the other product master data sources and assembles a point-in-time consolidated view of the product. This approach is typically adopted in merger and acquisition scenarios where there is a low probability of a central physical authority managing the data. The migration effort in this case is minimal and there are no big architectural changes to the organization's application landscape; however, the approach does not deliver the same level of product data management and data governance.

    5.0 Customer Scenario - Before EPC Deployment

    A leading global telecommunications service provider wanted to launch quad play and triple play service offerings in the shortest possible lead time. The service provider was offering broadband and VoIP services to customers, and wanted to reuse the majority of its broadband services and price plans and bundle them with new wireless and IPTV services for quad play and triple play.
    The challenges in launching the new service offerings were:

    Figure 8: Triple Play Plan

    • Broadband product data was stored in multiple product catalogues (CRM catalogue, billing catalogue, spreadsheets)
    • Product managers spent a lot of time duplicating or re-keying data; the manual effort caused errors and cost and time overruns
    • There was no effective product and price data governance mechanism; price change issues arising from the lack of data consistency across systems resulted in leakage of customer value and revenue
    • Product data had reusability issues and was not in a structured format, which resulted in uncontrolled product portfolio creation and product management issues
    • The lack of an enterprise product model led to product distribution challenges and delays in product launch
    • Designers were constrained by the existing legacy product management solutions when modelling product/service requirements and product configuration rules such as upgrading, downgrading and cross-selling

    5.1 Customer Scenario - After EPC Deployment

    Figure 9: SOA-based End-to-End EPC Solution

    After evaluating various product catalogues, the company deployed a PLM-based Enterprise Product Catalogue solution to launch the quad play service. The broadband product offering, service and price data were migrated to the new system, and the product and price plan hierarchies for the new offerings were created using the entities defined in the enterprise product model. Supplier product catalogue data, such as routers and set-top boxes, was loaded into the new solution through SOA-based web services. Price plans and configuration rules were built in the new system. The validated final product configurations were extracted from the product catalogue in SID format and distributed to the downstream B/OSS systems through exposed SOA-based web services. The transformations required by the B/OSS systems were handled in the solution's transformation layer.

    6.0 How PLM Enabled Product Management Transformation

    Figure 10: Product Management Transformation

    The PLM-based product catalogue solution helped the customer reduce the product launch cycle time by 30% and enabled the transformation of product management for next generation services.

    7.0 Conclusion

    On the one hand, the telecom industry is undergoing change driven by disruption, uncertain product markets and the increased complexity of products; on the other hand, ARPU is decreasing year-on-year. Communications service providers are embarking on convergence, bundled service offerings, the flexibility to cross-sell and up-sell, the introduction of new value-added services, and the leveraging of Web 2.0 concepts and network capabilities. Consequently, large-scale IT transformation initiatives that support network and business transformation and improve ARPU have become a business imperative, and product management has become a focus area. Companies are investing in best-in-class COTS solutions to reduce time-to-market, ensure rapid service delivery and improve operational efficiency. An efficient PLM-based enterprise product mastering solution plays a key role in achieving zero-touch automation and rapid product launch.

    Read the article

  • What makes a bad programming language bad?

    - by sub
    We have all seen things like the typing system of JavaScript (there is a funny post including a truth table somewhere around here). I consider this one of the main things that makes a programming language bad. Other things that spring to mind: bad error messages (either obfuscated so you can't figure out what's wrong, non-existent, or simply too long and red); a language that wasn't planned and just grew uncontrolled in all directions (PHP?); a language that encourages bad programming or programmer habits, such as global variables everywhere and bad variable names; and inconsistent naming conventions inside the language. I can't come up with any more at the moment and would be very happy to read what you think about this. What shouldn't be missing in a language created to be as bad as possible (from the perspectives of the programmer, the company that hires the programmer, the team leader and the customer)? (I ask this because I'm designing a bad, experimental language at the moment.)

    Read the article

  • Table comment length in MySQL

    - by azerole
    According to the MySQL manual, table comments are limited to 60 characters. I'm designing the schema in MySQL Workbench, which does not enforce this limit, so I quite often end up writing more than 60 characters, which causes the generated SQL script to fail. To tell the truth, I would be quite happy with table comments being internal to my schema (i.e. not exported to the actual database), but Workbench doesn't allow this either. Hence my question: is there a way to increase the maximum table comment length in MySQL to 255?

    Read the article

  • Reasons why one should not call the garbage collector directly.

    - by Shimrod
    Hi everyone, I'm currently writing a paper for my company about how to avoid calling the garbage collector directly from code (when playing with COM objects, for instance). I know this is bad practice and should only be considered in very rare cases, but I can't seem to find a way to explain why it should be avoided, and I don't want to rely on the "the GC is smarter than you" principle (even if it is the truth :-) ). So can you give me some clues as to why one should avoid calling the garbage collector directly? (Performance impact?) Or if you have links about this particular topic, they would be very helpful. Thanks in advance!

    Read the article

  • Multiple Condition Coverage Testing

    - by David Relihan
    Hi folks, when using the white-box testing method called Multiple Condition Coverage, do we take all conditional statements, or just the ones with multiple conditions? Maybe the clue is in the name, but I'm not sure. So if I have the following method:

        void someMethod()
        {
            if (a && b && (c || (d && e)))   // Conditional A
            {
            }
            if (z && q)                      // Conditional B
            {
            }
        }

    do I generate the truth table for just Conditional A, or do I also do Conditional B? Thanks,
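
    For what it's worth, the combination tables in question are easy to generate mechanically. A minimal Python sketch; the two functions below simply mirror the decisions in someMethod():

        from itertools import product

        # Mirror the two decisions from someMethod() as boolean functions.
        def conditional_a(a, b, c, d, e):
            return a and b and (c or (d and e))

        def conditional_b(z, q):
            return z and q

        # Multiple condition coverage enumerates every combination of the
        # atomic conditions within each decision: 2**5 rows for A, 2**2 for B.
        for combo in product([False, True], repeat=5):
            print(combo, conditional_a(*combo))

        for combo in product([False, True], repeat=2):
            print(combo, conditional_b(*combo))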

    Read the article

  • Why does the ASP.Net Web Forms model "suck"?

    - by Daniel Magliola
    I've heard Jeff Atwood, Joel Spolsky, and many other legendary people talk about how the ASP.NET Web Forms model sucks (so this question is kind of directed at them; hopefully Jeff is reading). Now, I highly respect their opinion, given their background and expertise, but truth be told, I absolutely LOVE ASP.NET. I think the model is brilliant; it sucks if you have no idea what you're doing, but once you understand how to control ViewState, when to use handlers instead of pages, and so on, it is generations ahead of all the other models. So every time I hear someone complain about how it sucks, I can't help asking the same question... Why? What is it that's so bad about it? I appreciate all opinions. I'm assuming there's probably a post on Jeff's blog talking about this too...

    Read the article

  • Why does GANTracker output the error "GANTracker.m not found"?

    - by favo
    Hi, I used the Google Analytics tracker in a previous iPhone OS project. Everything was working fine, and I copied and pasted the GANTracker library and the tracker initialization. When starting my new project, it tells me: "Xcode could not locate source file: GANTracker.m (line: 177)". To tell the truth, I don't know where to start debugging this one. I have included the library using #import "GANTracker.h". The error message occurs right within application:didFinishLaunchingWithOptions: and does not seem to have any connection to what is really going on. If I set a breakpoint on [window makeKeyAndVisible]; for example and wait a second, it occurs right after that, which makes it look like something is going on in the background with GANTracker. The tracker itself is created a little later by: [[GANTracker sharedTracker] startTrackerWithAccountID:@"xx" dispatchPeriod:10 delegate:nil]; [[GANTracker sharedTracker] trackPageview:@"pageview" withError:nil]; Thank you all in advance for your help!

    Read the article

  • How can I get information about the SIM card on Windows CE?

    - by jeleb
    Hello, I am using a device running Windows CE 5.00. I cannot connect to GPRS; there is an error message saying that the port cannot be opened and may be in use by another application. I am not sure, but I think the port it is talking about is a COM port to the SIM card, which cannot be opened because the SIM no longer works. To tell the truth, it is likely that one of my coworkers locked the SIM and kept his mouth shut... How can I get any information about the SIM on Windows CE? I am new at this, and Windows CE is not really user friendly... I have been searching hard on the internet but haven't found anything. Any kind of information will be welcome. Thanks,

    Read the article

  • Why do C compilers prepend underscores to external names?

    - by Michael Burr
    I've been working in C for so long that the fact that compilers typically add an underscore to the start of an extern is just understood... However, another SO question today got me wondering about the real reason why the underscore is added. A Wikipedia article claims that one reason is:

        It was common practice for C compilers to prepend a leading underscore to all external scope program identifiers to avert clashes with contributions from runtime language support

    I think there's at least a kernel of truth to this, but it also doesn't really seem to answer the question, since if the underscore is added to all externs it won't help much with preventing clashes. Does anyone have good information on the rationale for the leading underscore? Is the added underscore part of the reason that the Unix creat() system call doesn't end with an 'e'? I've heard that early linkers on some platforms had a limit of 6 characters for names. If that's the case, then prepending an underscore to external names would seem to be a downright crazy idea (now I only have 5 characters to play with...).

    Read the article

  • In which order do I have to simplify this Boolean expression?

    - by user3662105
    I have to simplify this Boolean expression, but I find it quite difficult since I don't know which order to start with. The expression is: (x iff y) or (y iff z) if x. I find it a little complicated to simplify since I don't know the order: do I have to read it as ((x iff y) or (y iff z)) if x, or as (x iff y) or ((y iff z) if x)? I would appreciate some guidance on this part of Boolean algebra, and steps on how to simplify it. I have to say that I have already tried a lot: I solved it by hand, tried Wolfram Alpha and others, and compared the results using truth tables, but I get different results and don't know which of them is right. Thanks in advance for your help.
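
    Since the puzzle is only about where the implicit parentheses go, a brute-force truth table will at least show whether the two readings differ. A small Python sketch, assuming "p if q" is read as "q implies p":

        from itertools import product

        def iff(p, q):     # p iff q
            return p == q

        def if_(p, q):     # "p if q", i.e. q implies p
            return (not q) or p

        for x, y, z in product([False, True], repeat=3):
            r1 = if_(iff(x, y) or iff(y, z), x)    # ((x iff y) or (y iff z)) if x
            r2 = iff(x, y) or if_(iff(y, z), x)    # (x iff y) or ((y iff z) if x)
            print(x, y, z, r1, r2, "" if r1 == r2 else "<- readings differ")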

    Read the article

  • Cobol: science and fiction

    - by user847
    There are a few threads about the relevance of the Cobol programming language on this forum, e.g. this thread links to a collection of them. What I am interested in here is a frequently repeated claim, based on a study by Gartner from 1997: that there were around 200 billion lines of Cobol code in active use at that time! I would like to ask some questions to verify or falsify a couple of related points. My goal is to understand whether this statement has any truth to it or whether it is totally unrealistic. I apologize in advance for being a little verbose in presenting my line of thought and my own opinion on the things I am not sure about, but I think it might help to put things in context and thus highlight any wrong assumptions and conclusions I have made.

    Sometimes the "200 billion lines" figure is accompanied by the added claim that this corresponded to 80% of all programming code in any language in active use. At other times, the 80% merely refers to so-called "business code" (or some other vague phrase hinting that the reader is not to count mainstream software, embedded systems or anything else where Cobol is practically non-existent). In the following I assume that the count does not include double-counting of multiple installations of the same software (since that would be cheating!).

    In particular in the time prior to the y2k problem, it was noted that a lot of Cobol code was already 20 to 30 years old. That would mean it was written in the late '60s and the '70s. At that time, the market leader was IBM with the IBM/370 mainframe. IBM has put up a historical announcement on its website quoting prices and availability. According to the sheet, prices were about one million dollars for machines with up to half a megabyte of memory.

    Question 1: How many mainframes were actually sold? I have not found any numbers for those times; the only numbers I have found are for the year 2000, again by Gartner. :^( I would guess that the actual number is in the hundreds or the low thousands; if the market size was 50 billion in 2000 and the market has grown exponentially like any other technology, it might have been merely a few billion back in 1970. Since the IBM/370 was sold for twenty years, twenty times a few thousand results in a couple of tens of thousands of machines (and that is pretty optimistic)!

    Question 2: How large were the programs, in lines of code? I don't know how many bytes of machine code result from one line of source code on that architecture. But since the IBM/370 was a 32-bit machine, any address access must have used 4 bytes plus the instruction (2, maybe 3 bytes for that?). If you count in the operating system and the program's data, how many lines of code would have fit into a main memory of half a megabyte?

    Question 3: Was there no standard software? Did every single machine sold run a unique hand-coded system without any standard software? Seriously, even if every machine was programmed from scratch without any reuse of legacy code (wait... didn't that violate one of the claims we started from to begin with???) we might have O(50,000 l.o.c./machine) * O(20,000 machines) = O(1,000,000,000 l.o.c.). That is still far, far, far away from 200 billion! Am I missing something obvious here?

    Question 4: How many programmers would we need to write 200 billion lines of code? I am really not sure about this one, but if we take an average of 10 l.o.c. per day, we would need 55 million man-years to achieve this! In a timeframe of 20 to 30 years, this would mean that there must have been two to three million programmers constantly writing, testing, debugging and documenting code. That would be about as many programmers as there are in China today, wouldn't it?

    Question 5: What about the competition? So far, I have come up with two things here: 1) IBM had its own programming language, PL/I. Above I have assumed that the majority of code was written exclusively in Cobol; however, all other things being equal, I wonder whether IBM marketing would really have pushed its own development off the market in favor of Cobol on its own machines. Was there really no relevant code base of PL/I? 2) Sometimes (also on this board, in the thread quoted above) I come across the claim that the "200 billion lines of code" are simply invisible to anybody outside of "governments, banks..." (and whatnot). Actually, the DoD had funded its own language in order to increase cost effectiveness and reduce the proliferation of programming languages, which led to the use of Ada. Would they really worry about having so many different programming languages if they had predominantly used Cobol? If there was any language running on "government and military" systems outside the perception of mainstream computing, wouldn't that language be Ada?

    I hope someone can point out any flaws in my assumptions and/or conclusions and shed some light on whether the above claim has any truth to it or not.
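
    The arithmetic in questions 3 and 4 is easy to sanity-check. A throwaway Python sketch; every input below is one of the post's own rough guesses, not real data:

        # All inputs are the post's own rough guesses, not real data.
        machines = 20000          # optimistic IBM/370 install base (Q1)
        loc_per_machine = 50000   # estimate from Q3
        print(machines * loc_per_machine)    # 1,000,000,000 -- vs. the claimed 200 billion

        claimed_loc = 200e9
        loc_per_day = 10
        man_years = claimed_loc / loc_per_day / 365
        print(round(man_years / 1e6))        # ~55 (million man-years, as in Q4)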

    Read the article

  • Is java.util.Scanner that slow?

    - by Cristian Vrabie
    Hi guys, in an Android application I want to use the Scanner class to read a list of floats from a text file (it's a list of vertex coordinates for OpenGL). The exact code is:

        Scanner in = new Scanner(new BufferedInputStream(getAssets().open("vertexes.off")));
        final float[] vertexes = new float[nrVertexes];
        for (int i = 0; i < nrVertexFloats; i++) {
            vertexes[i] = in.nextFloat();
        }

    It seems, however, that this is incredibly slow (it took 30 minutes to read 10,000 floats!), as tested on the 2.1 emulator. What's going on? I don't remember Scanner being that slow when I used it on the PC (truth be told, I never read more than 100 values before). Or is it something else, like reading from an asset input stream? Thanks for the help!

    Read the article

  • PagedDataSource does not support serialization - how can I enforce this ?

    - by Darkyo
    It sounds like I want to override a law of physics, but it is at least the most reasonable - CPU, disk and RAM effective - solution for my ASP.NET project. I have a PagedDataSource and a custom data reader that supports paginated data. The truth is, my data is in a ViewState variable, because it is re-used in an UpdatePanel. When I intend to use it in my PagedDataSource, ASP.NET 3.5 kills me with: "'System.Web.UI.WebControls.PagedDataSource' in Assembly 'System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' is not marked as serializable." Cool exception... So I'd rather not offend Newton, because I know he'll always win, but I need some help getting around this PagedDataSource law that seems so unbendable, unless someone has an explanation.

    Read the article

  • What is the best answer to give to "Why do you want to change from your present organization?"

    - by Techmaddy
    At present I am in a very good organization. I am planning to move because I am not happy with the work that I am getting now. I want to work under a different manager, but my manager and team are quite dependent on me. I tried many times, but couldn't change my team, so I started planning to switch companies. Everyone asks the same question: "Why do you want to change?" Should I tell the truth? I told it in two places, but did not see a good response from them. Or is there a better answer that I can give?

    Read the article

  • M2Crypto: Is PKey a reference to a Public or a Private key?

    - by Andrea Zilio
    In the PKey class documentation of the M2Crypto Python package (an OpenSSL wrapper for Python), it is said that PKey is a reference to a public key. My opinion is instead that it's a reference to a private key, because the __init__ method of the PKey class calls the OpenSSL function evp_pkey_new which, according to this link: http://linux.die.net/man/3/evp_pkey_new , should allocate a new reference to a private key structure! There are only two possible explanations: the M2Crypto documentation is wrong, or the link I've reported has wrong information. Can someone help me find the truth?
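
    One possible reconciliation: OpenSSL's EVP_PKEY is a generic key container rather than strictly public or strictly private, which could explain the two conflicting descriptions. A rough M2Crypto sketch from memory; treat the exact calls (EVP.PKey, RSA.gen_key, assign_rsa) as assumptions to verify, not gospel:

        from M2Crypto import EVP, RSA

        # Assumed API, from memory: an EVP.PKey starts out empty and only
        # becomes a usable key once a concrete keypair is assigned to it.
        pkey = EVP.PKey()
        rsa = RSA.gen_key(2048, 65537, callback=lambda *args: None)
        pkey.assign_rsa(rsa)   # now wraps the private key (public half included)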

    Read the article

  • Ambiguous Evaluation of Lambda Expression on Array

    - by Joe
    I would like to use a lambda that adds one to x if x is equal to zero. I have tried the following expressions:

        t = map(lambda x: x+1 if x==0 else x, numpy.array)
        t = map(lambda x: x==0 and x+1 or x, numpy.array)
        t = numpy.apply_along_axis(lambda x: x+1 if x==0 else x, 0, numpy.array)

    Each of these expressions returns the following error: "ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()". My understanding of map() and numpy.apply_along_axis() was that they would take some function and apply it to each value of an array. From the error it seems that the lambda is being evaluated with x as the whole array, not as some value in the array. What am I doing wrong? I know that I could write a function to accomplish this, but I want to become more familiar with the functional programming aspects of Python.
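
    As a point of comparison, the NumPy-idiomatic route expresses the condition and the choice over the whole array at once rather than mapping a scalar lambda over it. A minimal sketch; the array contents are made up for illustration:

        import numpy as np

        arr = np.array([0.0, 2.0, 0.0, 5.0])

        # Evaluate the condition element-wise, then pick per element:
        # arr + 1 where the element is zero, the element itself otherwise.
        t = np.where(arr == 0, arr + 1, arr)
        print(t)   # [ 1.  2.  1.  5.]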

    Read the article

  • How Likely Is It That I'll Get Sued Developing Software?

    - by yar
    It has been a practically unanimous truth on StackOverflow that if you work as an independent consultant, you should probably form a corporation (as seen here) to limit personal liability, supposedly to protect you in case of a lawsuit. It seems to me that developing software does not result in many lawsuits, but this is an empirical (objective [and not community wiki]) question: how likely is it that a lone software developer will be sued? Also, by whom (a disgruntled company, a coworker)? Since incorporating is basically taking out insurance, the likelihood of catastrophe needs to be taken into account. Also, aren't there standard laws covering, for example, total screw-ups with corporate data that protect the lone cowboy/cowgirl coder anyway?

    Read the article

  • Suggestions for a good IDE for DB2

    - by ken
    Hi all, I know, I know... it's a horrible fate, but I am forced to work in an environment with DB2 on the back end. OK, just kidding, but the truth is I do like MSSQL's data studio a lot, and IBM's tool is sort of crummy in my opinion... I was using the free version of Toad, but I just got a new 64-bit machine, which is nice and all, except that I can't find a free version of Toad for Win7 64-bit. Does anyone have any suggestions for a good IDE to use with DB2? Being a developer, I really just do a lot of looking at the DB structure and querying to see what I get back and how I want to get things back, etc... Thanks for any advice!

    Read the article

  • Passing an operator as a parameter

    - by nacho4d
    Hi, I want to have a function that evaluates two bool variables (like a truth table). For example, since T | F : T, then myfunc('t', 'f', ||) - defined as: bool myfunc(char lv, char rv, ????) - should return true. How can I pass the third parameter? (I know it is possible to pass it as a char*, but then I would have to have another table to compare the operator string against before doing the operation, which is something I would like to avoid.) Is it possible to pass an operator like ^ (XOR), || (OR) or && (AND) in a function/method? Thanks in advance.
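
    The usual technique, whatever the language, is to pass a function that implements the operator (in the C/C++ terms of the question, a function pointer or functor). A quick illustration of the idea in Python, whose standard operator module already exposes the operators as functions:

        import operator

        def myfunc(lv, rv, op):
            # Map the 't'/'f' characters to booleans, then apply the operator.
            return op(lv == 't', rv == 't')

        print(myfunc('t', 'f', operator.or_))   # True:  T or F
        print(myfunc('t', 'f', operator.and_))  # False: T and F
        print(myfunc('t', 'f', operator.xor))   # True:  T xor F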

    Read the article

  • Does anyone else think instance variables are problematic in database-backed applications?

    - by Ben Aston
    It occurs to me that state control in languages like C# is not well supported. By this I mean that it is left up to the programmer to manage the state of in-memory objects. A common use case is that instance variables in the domain model are copies of information residing in persistent storage (i.e. the database). Clearly this violates the single point of authority principle, and "synchronisation" has to be managed by the developer. I envisage a system where, instead of instance variables, we have simple public accessor/mutator methods marked with attributes that link them to the database, and where reads and writes are mediated by a framework that decides whether to hit the database. Does such a system exist? Am I completely missing the point, or is there some truth to this idea?
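
    Systems in this spirit do exist; object-relational mappers with lazily loaded attributes come close. As a minimal sketch of the idea in Python, with a hypothetical store object standing in for the mediating framework (load_field and save_field are made-up names, not a real API):

        # Attribute access mediated by a "framework" that decides when to
        # hit the database; the store and its methods are hypothetical.
        class DbField:
            def __init__(self, column):
                self.column = column

            def __get__(self, obj, owner):
                # The store may serve a cached value or re-read the database.
                return obj.store.load_field(obj.key, self.column)

            def __set__(self, obj, value):
                # Single point of authority: writes always go through the store.
                obj.store.save_field(obj.key, self.column, value)

        class Customer:
            name = DbField("name")   # no instance variable holds the name

            def __init__(self, store, key):
                self.store, self.key = store, key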

    Read the article

  • What are the latest options in Java logging frameworks?

    - by sanity
    This question gets asked periodically, but I've long felt that existing Java logging frameworks are overcomplicated and over-engineered, and I want to see what's new. I have a more pressing issue on my current project, as we've standardized on JSON as our human-readable data encoding, and most logging frameworks I've seen require XML. I would really rather avoid using JSON for 95% of my app's configuration and XML for the rest just because of the logging framework (truth be told, I hate XML used for anything other than text markup, its original intended purpose). Are there any hot new Java logging frameworks that are actively maintained, reasonably powerful, have a Maven repo, can be reconfigured without restarting your app, and don't tie you to XML?

    Read the article
