Search Results

Search found 27637 results on 1106 pages for 'source packages'.

  • Why can't I set the Image Source in WPF app?

    - by elmonty
    I'm just learning WPF. I'm using VS 2008. I've added an Image control to my XAML, and I added a GIF file to the project. If I use the XAML Designer's property panel to set the image's Source property, the drop-down list for the Source property contains the following: C:\Sample Project;component/pinHorizontal.gif There are several problems with this: My project, named "Sample Project," is not in the root of my drive. Why is ";component" in there? If I select this value as given, I get the error "Property value is not valid" (yeah, no kidding). If I go into the XAML source and set the Source property manually, like so: The XAML Designer gives me this error: "The file pinHorizontal.gif is not part of the project or its 'Build Action' property is not set to 'Resource'." Why is this task so difficult? How do I assign an image source?

  • What should I really release as open source when I use Berkeley DB (JE)?

    - by Ersin Er
    The Berkeley DB (JE) license information says: "...Redistributions in any form must be accompanied by information on how to obtain complete source code for the DB software and any accompanying software that uses the DB software.... For an executable file, complete source code means the source code for all modules it contains. It does not include source code for modules or files that typically accompany the major components of the operating system on which the executable file runs." If I build an abstraction level on top of BDB JE, should I open source only that library or any software that uses BDB JE indirectly?

  • Unsigned lenny packages with aptitude safe-upgrade

    - by Liam
    I have several Debian lenny computers. Two have nearly identical sources.list files. On both, I do regular update/safe-upgrades. On one it always goes smoothly. On the other, much of the time I get the following:

    sudo aptitude safe-upgrade
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Reading extended state information
    Initializing package states... Done
    Reading task descriptions... Done
    The following packages will be upgraded:
      krb5-clients krb5-ftpd krb5-rsh-server krb5-telnetd krb5-user libimlib2 libkadm55 libkrb53 libpng12-0 libpulse0 xpdf xpdf-common xpdf-reader
    13 packages upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
    Need to get 2906kB of archives. After unpacking 36.9kB will be used.
    Do you want to continue? [Y/n/?]
    WARNING: untrusted versions of the following packages will be installed!
    Untrusted packages could compromise your system's security. You should only proceed with the installation if you are certain that this is what you want to do.
      krb5-rsh-server krb5-user krb5-ftpd krb5-clients libkrb53 xpdf-reader libpng12-0 libkadm55 xpdf libpulse0 libimlib2 krb5-telnetd xpdf-common
    Do you want to ignore this warning and proceed anyway?
    To continue, enter "Yes"; to abort, enter "No": no
    Abort.

Needless to say, I don't proceed. What is going on? How do I fix it? These are the non-comment lines in the sources.list for this computer:

    deb ftp://ftp.debian.org/debian/ lenny main contrib non-free
    deb-src ftp://ftp.debian.org/debian/ lenny main contrib
    deb http://security.debian.org/ lenny/updates main contrib non-free

Thank you.

  • Source-control 'wet-work'?

    - by Phil Factor
    When a design or creative work is flawed beyond remedy, it is often best to destroy it and start again. The other day, I lost the code to a long and intricate SQL batch I was working on. I’d thought it was impossible, but it happened. With all the technology around that is designed to prevent this occurring, this sort of accident has become a rare event. If it weren’t for a deranged laptop, and my distraction, the code wouldn’t have been lost this time. As always, I sighed, had a soothing cup of tea, and typed it all in again. The new code I hastily tapped in was much better: I’d held in my head the essence of how the code should work rather than the details: I now knew for certain the start point, the end, and how it should be achieved. Instantly the detritus of half-baked thoughts fell away and I was able to write logical code that performed better. Because I could work so quickly, I was able to hold the details of all the columns and variables in my head, and the dynamics of the flow of data. It was, in fact, easier and quicker to start from scratch rather than tidy up and refactor the existing code with its inevitable fumbling and half-baked ideas.

What a shame that technology is now so good that developers rarely experience the cleansing shock of losing one’s code and having to rewrite it from scratch. If you’ve never accidentally lost your code, then it is worth doing it deliberately once for the experience. Creative people have, until technology mistakenly prevented it, torn up their drafts or sketches, thrown them in the bin, and started again from scratch. Leonardo’s obsessive reworking of the Mona Lisa was renowned because it was so unusual: most artists have been utterly ruthless in destroying work that didn’t quite make it. Authors are particularly keen on writing afresh, and the results are generally positive. Lawrence of Arabia actually lost the entire 250,000-word manuscript of ‘The Seven Pillars of Wisdom’ by accidentally leaving it on a train at Reading station, before rewriting a much better version. Now, any writer or artist is seduced by technology into altering or refining their work rather than casting it dramatically in the bin or setting a light to it on a bonfire, and rewriting it from the blank page. It is easy to pick away at a flawed work, but the real creative process is far more brutal.

Once, many years ago, whilst running a software house that supplied commercial software to local businesses, I’d been supervising an accounting system for a farming cooperative. No packaged system met their needs, and it was all hand-cut code. For us, it represented a breakthrough, as it was for a government organisation, and success would guarantee more contracts. As you’ve probably guessed, the code got mangled in a disk crash just a week before the deadline for delivery, and the many backups all proved to be entirely corrupted by a faulty tape drive. There were some fragments left on individual machines, but they were all of different versions. The developers were in despair. Strangely, I managed to re-write the bulk of a three-month project in a manic and caffeine-soaked weekend. Sure, that elegant universally-applicable input-form routine wasn’t quite so elegant, but it didn’t really need to be, as we knew what forms it needed to support. Yes, the code lacked architectural elegance and reusability. By dawn on Monday, the application passed its integration tests.

The developers rose to the occasion after I’d collapsed, and tidied up what I’d done, though they were reproachful that some of the style and elegance had gone out of the application. By the delivery date, we were able to install it. It was a smaller, faster application than the beta they’d seen, and the user interface had a new, rather Spartan, appearance that we swore was done to conform to the latest in user-interface guidelines. (We switched to Helvetica to look more ‘Bauhaus’.) The client was so delighted that he forgave the new bugs that had crept in. I still have the disk that crashed, up in the attic.

In IT, we have had mixed experiences from complete re-writes. Lotus 1-2-3 never really recovered from a complete rewrite from assembler into C, Borland made the same mistake with Arago and Quattro Pro, and Netscape’s complete rewrite of their Navigator 4 browser was a white-knuckle ride. In all cases, the decision to rewrite was a result of extreme circumstances where no other course of action seemed possible; the rewrite didn’t come out of the blue. I prefer to remember the rewrite of Minix by young Linus Torvalds, or the rewrite of Bitkeeper by a slightly older Linus. The rewrite of CP/M didn’t do too badly either, did it? Come to think of it, the guy who decided to rewrite the windowing system of the Xerox Star never regretted the decision.

I’ll agree that one should often resist calls for a rewrite. One of the worst habits of the more inexperienced programmer is to denigrate whatever code he or she inherits, and then call loudly for a complete rewrite. They are buoyed up by the mistaken belief that they can do better. This, however, is a different psychological phenomenon, more related to the idea of some motorcyclists that they are operating on infinite lives, or of the occasional squaddie that if they charge the machine-guns determinedly enough all will be well. Grim experience brings out the humility in any experienced programmer. I’m referring to quite different circumstances here. Where a team knows the requirements perfectly, is of one mind on methodology and coding standards, and already has a solution, then what is wrong with considering a complete rewrite? Rewrites are so painful in the early stages, until that point where one realises the payoff, that even I quail at the thought. One needs a natural disaster to push one over the edge. The trouble is that source-control systems, and disaster recovery systems, are just too good nowadays.

If I were to lose this draft of this very blog post, I know I’d rewrite it much better. However, if you read this, you’ll know I didn’t have the nerve to delete it and start again. There was a time that one prayed that unreliable hardware would deliver you from an unmaintainable mess of a codebase, but now technology has made us almost entirely immune to such a merciful act of God. An old friend of mine with long experience in the software industry has long had the idea of the ‘source-control wet-work’, where one hires a malicious hacker in some wild eastern country to hack into one’s own source control system to destroy all trace of the source to an application. Alas, backup systems are just too good to make this any more than a pipedream. Somehow, it would be difficult to promote the idea. As an alternative, could one construct a source control system that, on doing all the code-quality metrics, would systematically destroy all trace of source code that failed the quality test?

Alas, I can’t see many managers buying into the idea. Reading the full story of the near-loss of Toy Story 2 set me thinking. It turned out that the lucky restoration of the code wasn’t the happy ending one first imagined it to be, because they eventually came to the conclusion that the plot was fundamentally flawed and it all had to be rewritten anyway. Was this an early case of the ‘source-control wet-job’? It is very hard nowadays to do a rapid U-turn in a development project because we are far too prone to cling to our existing source-code.

  • Data Source Connection Pool Sizing

    - by Steve Felts
    One of the most time-consuming procedures of a database application is establishing a connection. The connection pooling of the data source can be used to minimize this overhead. That argues for using the data source instead of accessing the database driver directly. Configuring the size of the pool in the data source is somewhere between an art and a science – this article will try to move it closer to science.

From the beginning, the WLS data source has had initial capacity and maximum capacity configuration values. When the system starts up and when it shrinks, initial capacity is used. The pool can grow to maximum capacity. Customers found that they might want to set the initial capacity to 0 (more on that later) but didn’t want the pool to shrink to 0. In WLS 10.3.6, we added minimum capacity to specify the lower limit to which a pool will shrink. If minimum capacity is not set, it defaults to the initial capacity for upward compatibility. We also did some work on the shrinking in release 10.3.4 to reduce thrashing; the algorithm that used to shrink to the maximum of the currently used connections or the initial capacity (basically the unused connections were all released) was changed to shrink by half of the unused connections.

The simple approach to sizing the pool is to set the initial/minimum capacity to the maximum capacity. Doing this creates all connections at startup, avoiding creating connections on demand, and the pool is stable. However, there are a number of reasons not to take this simple approach. When WLS is booted, the deployment of the data source includes synchronously creating the connections. The more connections that are configured in initial capacity, the longer the boot time for WLS (there have been several projects for parallel boot in WLS but none that are available). Related to creating a lot of connections at boot time is the problem of logon storms (the database gets too much work at one time). WLS has a solution for that by setting the login delay seconds on the pool, but that also increases the boot time.

There are a number of cases where it is desirable to set the initial capacity to 0. By doing that, the overhead of creating connections is deferred out of the boot and the database doesn’t need to be available. An application may not want WLS to automatically connect to the database until it is actually needed, such as for some cold/warm failover configurations.

There are a number of cases where minimum capacity should be less than maximum capacity. Connections are generally expensive to keep around. They cause state to be kept on both the client and the server, and the state on the backend may be heavy (for example, a process). Depending on the vendor, connection usage may cost money. If the workload is not constant, then database connections can be freed up by shrinking the pool when connections are not in use. When using Active GridLink, connections can be created as needed according to runtime load balancing (RLB) percentages instead of by connection load balancing (CLB) during data source deployment.

Shrinking is an effective technique for clearing the pool when connections are not in use. In addition to the obvious reason that there are times when the workload is lighter, there are some configurations where the database and/or firewall conspire to make long-unused or too-old connections no longer viable. There are also some data source features where the connection has state and cannot be used again unless the state matches the request. Examples of this are identity-based pooling, where the connection has a particular owner, and XA affinity, where the connection is associated with a particular RAC node. At this point, WLS does not re-purpose (discard/replace) connections, and shrinking is a way to get rid of the unused existing connection and get a new one with the correct state when needed.

So far, the discussion has focused on the relationship of initial, minimum, and maximum capacity. Computing the maximum size requires some knowledge about the application and the current number of simultaneously active users, web sessions, batch programs, or whatever access patterns are common. The applications should be written to only reserve and close connections as needed, but multiple statements, if needed, should be done in one reservation (don’t get/close more often than necessary). This means that the size of the pool is likely to be significantly smaller than the number of users. If possible, you can pick a size and see how it performs under simulated or real load. There is a high-water mark statistic (ActiveConnectionsHighCount) that tracks the maximum connections concurrently used. In general, you want the size to be big enough so that you never run out of connections, but no bigger. It will need to deal with spikes in usage, which is where shrinking after the spike is important. Of course, the database capacity also has a big influence on the decision since it’s important not to overload the database machine. Planning also needs to happen if you are running in a Multi-Data Source or Active GridLink configuration and expect that the remaining nodes will take over the connections when one of the nodes in the cluster goes down. For XA affinity, additional headroom is also recommended.

In summary, setting initial and maximum capacity to be the same may be simple, but there are many other factors that may be important in making the decision about sizing.
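To make the "reserve late, release early" advice above concrete, here is a minimal Java sketch of how application code typically borrows a connection from a WLS data source through JNDI and returns it to the pool right away. The JNDI name "jdbc/myDS", the table, and the query are invented placeholders for illustration, not anything taken from this article.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class PoolUsageSketch {
        public static int countOrders(int customerId) throws Exception {
            // Look up the container-managed data source; the pool behind it is what
            // the initial/minimum/maximum capacity settings discussed above are sizing.
            DataSource ds = (DataSource) new InitialContext().lookup("jdbc/myDS"); // hypothetical JNDI name

            // Reserve the connection as late as possible and do all related statements
            // in this one reservation, rather than reserving again per statement.
            try (Connection con = ds.getConnection();
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT COUNT(*) FROM orders WHERE customer_id = ?")) {
                ps.setInt(1, customerId);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    return rs.getInt(1);
                }
            } // close() returns the connection to the pool; it does not tear down the backend session
        }
    }

The number of callers concurrently inside that try block is roughly what the ActiveConnectionsHighCount statistic mentioned above ends up recording, which is why it is a useful input when choosing maximum capacity.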

  • Where can I collaborate with my friend on source code in real time?

    - by Carson Myers
    I mean, other than a conference room :) Using google docs, I can upload any kind of file and view it with other people, watch them edit it in real time, with a live chat happening in the same window. This is awesome. How can I do the same thing with source code? I'm looking for a web application where I can upload source files that will be displayed in some kind of editor, with syntax highlighting, and allow others to view it and edit it in real time. Preferably with a live chat also, but not necessary. Does anybody know where I can find this?

  • Where do you find the Bash source files of programs in Linux/Mac OSX?

    - by AndrewKS
    I am currently writing some convenience methods for my terminal in my bash_profile and am not sure if what I am writing is "the best way". I figure a good way to verify whether what I'm doing is right or not would be to find some source code of more established programs and see how they do it. My question then is, where can I find this code on my Mac? An example is, with MacPorts installed, where is the source code that opens the port interactive console when I type nothing but "port" in my shell? (I added Linux in the title even though I am on a Mac because I assume the answer would be the same for both.) Edit: The answer I am looking for is in terms of which directory, relative to the programs, I will find their Unix scripts in.

  • I want to make my Application Open Source, but I'm unsure...

    - by Joern
    Hi! I have written a website, which is my big idea. Now I'm planning a version for intranets. The big question in my head is: "shall I make it open source?" The next ones are:

    What do I have to prepare?
    What do I need for a good start?
    What is the best way to find people with the same interests?
    Who gives me the proof that I won't find a copy of my idea on another page, stolen from my open source code?

Please discuss with me, give some good arguments pro and contra. Yours, Joern.

  • How to choose an open source, Asterisk friendly firewall?

    - by Lucas
    I'm in pain. We are moving to a SIP-based VOIP system and, for whatever reason, we could not get our hosted Asterisk solution to work with our Sonicwall. Our VOIP provider gave up and is recommending an open source vendor, pfSense. A little background: We have about 30 users in our network. We use a few IPSec VPN connections for remote networks. I would like, but don't need, application layer filtering. We're active internet users, so proper traffic shaping is probably a concern. How can I tell if an open source firewall will handle VOIP setup smoothly with a hosted Asterisk system?

  • Deciding On Features For Open Source

    - by Robz / Fervent Coder
    Open source feature selection is subjective. An interesting question was posed to me recently at a presentation - “How do you decide what features to include in the [open source] projects you manage?” Is It Objective? I’d like to say that it’s really objective and that we vote on features and look at what carries the most interest of the populace. Actually no I wouldn’t. I don’t think I would enjoy working on open source (OSS) as much if it someone else decided on what features I should include. It already works that way at work. I don’t want to come home from work and work on things that others decide for me unless they are paying me for those features. So how do I decide on features for our open source projects? I think there are at least three paths to feature selection and they are not necessarily mutually exclusive. Feature Selection IS the Set of Features For the Domain Your product, in whatever domain it is in, needs to have the basic set of features that make it answer the needs of that domain. That is different for every product, but if you take for example a build tool, at the very least it needs to be able to compile source. And these basic needed features are not always objective either. Two people could completely disagree what makes for a required feature to meet a domain need for a product. Even one person may disagree with himself/herself about what features are needed based on different timeframes. So that leads us down to subjective. Feature Selection IS An Answer To Competition Some features go in because the competition adds a feature that may draw others away from your product offering. With OSS, there are all free alternatives, so if your competition adds a killer feature and you don’t, there isn’t much other than learning (how to use the other product) to move your customers off to the competition. If you want to keep your customers, you need to be ready to answer the questions of adding the features your competition has added.  Sometimes it’s about adding a feature that your competition charges for, but you add it for free. That draws people to the free alternative – so sometimes that adds a motivation to select a feature. Sometimes it’s because you want those features in your product, either to learn how you can answer the question of how to do something and/or because you have a need for that feature and you want it in your product. That also leads us down the road to subjective. Feature Selection IS Subjective I decide on features based on what I want to see in the product I am working on. Things I am interested in or have the biggest need for usually get picked first, with things that do not interest me either coming later or not at all. Most people get interested in an area of OSS because it solves a need for them and/or they find it interesting. If one of these two things is not happening and they are not being paid, it’s likely that person will move on to something else they find interesting or just stop OSS altogether. OSS feature selection is just that – subjective. If it wasn’t, it wouldn’t be opinionated and it wouldn’t have a personality about it. Most people like certain OSS because they like where the product is going or the personalities behind the product. For me, I want my products to be easy to use and solve an important problem. If it takes you more than 5-10 minutes to learn how to use my product, I know you are probably going somewhere else. 
So I pick features that make the product easy to use and learn, and those are not always the simplest features to work on. I work for conventions and make the product opinionated, because I think that is what makes using a product easier, if it already works with little setup. And I like to provide the ability for power users to get in and change the conventions to suit their needs. So those are required features for me above and beyond the domain features. I like to think I do a pretty good job at this. Usually when I present on something I’ve created, I like seeing people’s eyes light up when they see how simple it is to set up a powerful product like UppercuT. Patches And/Or Donations But remember before you say I’m a bad person or won’t use my product, I’ll always accept patches or I might like the feature that you suggest. If you like using the products I provide and they solve a problem for you the two biggest compliments you can provide are either a patch or a donation.  If you think the product is great, but if it could do this one other thing, it would be awesome(!), then consider contacting me and providing a patch, or consider contacting me with a donation and a request to put the feature in. And alternatively, if it’s a big feature, you could hire me to work on the product to make it even better. What If There Are Multiple Committers? In the question of multiple committers, I choose that someone always makes the ultimate decision to select whether a feature should be part of a product or not. But for other OSS project maybe this is not the case. If there is not an ultimate decision maker, then there is the possibility of either adding every feature suggested or having a deadlock on two conflicting features.   So let me pose this question. If you work on Open Source, how do you decide on what features to put in your open source projects? How do you decide what doesn’t belong? What do you do when there are conflicting features?

  • Sort Data in Windows Phone using Collection View Source

    - by psheriff
    When you write a Windows Phone application you will most likely consume data from a web service somewhere. If that service returns data to you in a sort order that you do not want, you have an easy alternative to sort the data without writing any C# or VB code. You use the built-in CollectionViewSource object in XAML to perform the sorting for you. This assumes that you can get the data into a collection that implements the IEnumerable or IList interfaces.

For this example, I will be using a simple Product class with two properties, and a list of Product objects using the Generic List class. Try this out by creating a Product class as shown in the following code:

    public class Product
    {
      public Product(int id, string name)
      {
        ProductId = id;
        ProductName = name;
      }

      public int ProductId { get; set; }
      public string ProductName { get; set; }
    }

Create a collection class that initializes a property called DataCollection with some sample data as shown in the code below:

    public class Products : List<Product>
    {
      public Products()
      {
        InitCollection();
      }

      public List<Product> DataCollection { get; set; }

      List<Product> InitCollection()
      {
        DataCollection = new List<Product>();
        DataCollection.Add(new Product(3, "PDSA .NET Productivity Framework"));
        DataCollection.Add(new Product(1, "Haystack Code Generator for .NET"));
        DataCollection.Add(new Product(2, "Fundamentals of .NET eBook"));
        return DataCollection;
      }
    }

Notice that the data added to the collection is not in any particular order. Create a Windows Phone page and add two XML namespaces to the Page.

    xmlns:scm="clr-namespace:System.ComponentModel;assembly=System.Windows"
    xmlns:local="clr-namespace:WPSortData"

The 'local' namespace is an alias to the name of the project that you created (in this case WPSortData). The 'scm' namespace references the System.Windows.dll and is needed for the SortDescription class that you will use for sorting the data. Create a phone:PhoneApplicationPage.Resources section in your Windows Phone page that looks like the following:

    <phone:PhoneApplicationPage.Resources>
      <local:Products x:Key="products" />
      <CollectionViewSource x:Key="prodCollection"
          Source="{Binding Source={StaticResource products},
                           Path=DataCollection}">
        <CollectionViewSource.SortDescriptions>
          <scm:SortDescription PropertyName="ProductName"
                               Direction="Ascending" />
        </CollectionViewSource.SortDescriptions>
      </CollectionViewSource>
    </phone:PhoneApplicationPage.Resources>

The first line of code in the resources section creates an instance of your Products class. The constructor of the Products class calls the InitCollection method which creates three Product objects and adds them to the DataCollection property of the Products class. Once the Products object is instantiated you now add a CollectionViewSource object in XAML using the Products object as the source of the data to this collection. A CollectionViewSource has a SortDescriptions collection that allows you to specify a set of SortDescription objects. Each object can set a PropertyName and a Direction property. As you see in the above code you set the PropertyName equal to the ProductName property of the Product object and tell it to sort in an Ascending direction.

All you have to do now is to create a ListBox control and set its ItemsSource property to the CollectionViewSource object. The ListBox displays the data in sorted order by ProductName and you did not have to write any LINQ queries or write other code to sort the data!

    <ListBox
        ItemsSource="{Binding Source={StaticResource prodCollection}}"
        DisplayMemberPath="ProductName" />

Summary
In this blog post you learned that you can sort any data without having to change the source code of where the data comes from. Simply feed the data into a CollectionViewSource in XAML and set some sort descriptions in XAML and the rest is done for you! This comes in very handy when you are consuming data from a source where the data is given to you and you do not have control over the sorting.

NOTE: You can download this article and many samples like the one shown in this blog entry at my website, http://www.pdsa.com/downloads. Select "Tips and Tricks", then "Sort Data in Windows Phone using Collection View Source" from the drop down list.

Good Luck with your Coding,
Paul Sheriff

** SPECIAL OFFER FOR MY BLOG READERS ** We frequently offer a FREE gift for readers of my blog. Visit http://www.pdsa.com/Event/Blog for your FREE gift!

  • Database Partitioning and Multiple Data Source Considerations

    - by Jeffrey McDaniel
    With the release of P6 Reporting Database 3.0, partitioning was added as a feature to help with performance and data management. Careful investigation of requirements should be conducted prior to installation to help improve overall performance throughout the lifecycle of the data warehouse, preventing future maintenance that would result in data loss. Before installation, try to determine how many data sources and partitions will be required along with the ranges. In P6 Reporting Database 3.0 any adjustments outside of defaults must be made in the scripts, and changes will require new ETL runs for each data source.

Considerations:

1. Standard Edition or Enterprise Edition of Oracle Database. If you aren't using Oracle Enterprise Edition Database, the partitioning feature is not available. Multiple data sources are only supported on Enterprise Edition of Oracle Database.

2. Number of data source IDs for partitioning during configuration. This setting will specify how many partitions will be allocated for tables containing data source information. This setting requires some evaluation prior to installation as there are repercussions if you don't estimate correctly. For example, suppose you configured the software for only 2 data sources and the partition setting was set to 2, but then along came a 3rd data source. The necessary steps to accommodate this change are as follows:

a) By default, 3 partitions are configured in the Reporting Database scripts. Edit the create_star_tables_part.sql script located in <installation directory>\star\scripts and search for "partition". You’ll see P1, P2, P3. Add additional partitions and sub-partitions for P4 and so on. These will appear in several areas. (See the P6 Reporting Database 3.0 Installation and Configuration guide for more information on this and how to adjust partition ranges.)

b) Run starETL -r. This will recreate each table with the new partition key. The effect of this step is that all table data will be lost except for history-related tables.

c) Run starETL for each of the 3 data sources (with the data source #, e.g. starETL.bat "-s2", as defined in the P6 Reporting Database 3.0 Installation and Configuration guide).

The best strategy for this setting is to overestimate based on possible growth. If during implementation it is deemed that there are at least 2 data sources with possibility for growth, it is a better idea to set this setting to 4 or 5, allowing room for the future and preventing a ‘start over’ scenario.

3. The Number of Partitions and the Number of Months per Partition are not specific to multi-data source. These settings work in accordance with a sub-partition of larger tables with regard to time-related data. These settings are dataset specific, for optimization. The number of months per partition is self explanatory: optimally, the smaller the partition, the better the query performance, so if the dataset has an extremely large number of spread/history records, a lower number of months is optimal. Working in accordance with this setting is the number of partitions; this will determine how many "buckets" will be created per the number of months setting. For example, if you kept the default of 3 for the number of partitions, and selected 2 months for each partition, you would end up with:

- 1st partition, 2 months
- 2nd partition, 2 months
- 3rd partition, all the remaining records

Therefore, with regard to this setting, it is important to analyze your source database spread ranges and history settings when determining the proper number of months per partition and number of partitions to optimize performance. Also be aware the DBA will need to monitor when these partition ranges will fill up and when additional partitions will need to be added. If you get to the final range partition and there are no additional range partitions, all data will be included in the last partition.

  • java.lang.NoClassDefFoundError in command line

    - by Graham
    Hi, I'm developing an application in Eclipse and it runs fine from within Eclipse. The problem I'm having is that when I export it to a jar file and run it from the command line I get a NoClassDefFound error for javax.mail.internet. In both my project build path and class path I have included the activation.jar and mail.jar libraries required for me to use javax.mail.internet, and like I said it works fine from within Eclipse but not when I export it to a jar. If my build path has those files and so does my class path, why would this not be working? Here is the error stack:

    Exception in thread "AWT-EventQueue-0" java.lang.NoClassDefFoundError: javax/mail/internet/InternetAddress
        at airit.com.Auxiliary.validateEmail(Auxiliary.java:29)
        at airit.com.MainFrame.actionPerformed(MainFrame.java:79)
        at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
        at javax.swing.AbstractButton$Handler.actionPerformed(Unknown Source)
        at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
        at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
        at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
        at java.awt.Component.processMouseEvent(Unknown Source)
        at javax.swing.JComponent.processMouseEvent(Unknown Source)
        at java.awt.Component.processEvent(Unknown Source)
        at java.awt.Container.processEvent(Unknown Source)
        at java.awt.Component.dispatchEventImpl(Unknown Source)
        at java.awt.Container.dispatchEventImpl(Unknown Source)
        at java.awt.Component.dispatchEvent(Unknown Source)
        at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
        at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
        at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
        at java.awt.Container.dispatchEventImpl(Unknown Source)
        at java.awt.Window.dispatchEventImpl(Unknown Source)
        at java.awt.Component.dispatchEvent(Unknown Source)
        at java.awt.EventQueue.dispatchEvent(Unknown Source)
        at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
        at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
        at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
        at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
        at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
        at java.awt.EventDispatchThread.run(Unknown Source)
    Caused by: java.lang.ClassNotFoundException: javax.mail.internet.InternetAddress
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        ... 27 more
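For context, a hedged Java sketch of the kind of helper the first stack frame points at is shown below; the body of the Auxiliary class is guessed from the trace, not the poster's real code. The underlying point is generic: Eclipse's build path only affects compilation, so when the exported jar is run from the command line, mail.jar and activation.jar must also be on the runtime classpath (for example via java -cp or a Class-Path entry in the jar's MANIFEST.MF), otherwise InternetAddress cannot be loaded and exactly this NoClassDefFoundError appears.

    import javax.mail.internet.AddressException;
    import javax.mail.internet.InternetAddress;

    // Hypothetical reconstruction of the helper named at Auxiliary.java:29 in the trace.
    public final class Auxiliary {

        // Returns true when the address parses as a valid RFC 822 address.
        public static boolean validateEmail(String address) {
            try {
                // Strict parsing; this call is what needs the JavaMail classes at runtime.
                new InternetAddress(address, true);
                return true;
            } catch (AddressException ex) {
                return false;
            }
        }
    }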

  • how to display a JFrame from an applet?

    - by mithun1538
    Hello everyone, I have this class called PollFrame that extends JFrame in a file called PollFrame.java. PollFrame contains a form. I have an applet, which has a button in it. When the button is clicked, I want the PollFrame to be displayed. I set the ActionPerformed as:

    PollFrame poll = new PollFrame(); // This initializes the form
    poll.setVisible(true);

However, when I click the button, I get the following error:

    Exception in thread "AWT-EventQueue-2" java.security.AccessControlException: access denied (java.lang.RuntimePermission exitVM.0)
        at java.security.AccessControlContext.checkPermission(Unknown Source)
        at java.security.AccessController.checkPermission(Unknown Source)
        at java.lang.SecurityManager.checkPermission(Unknown Source)
        at java.lang.SecurityManager.checkExit(Unknown Source)
        at javax.swing.JFrame.setDefaultCloseOperation(Unknown Source)
        at com.org.pollFrame.initComponents(pollFrame.java:54)
        at com.org.pollFrame.<init>(pollFrame.java:11)
        at com.org.EmployeeApplet.requestRoomActionPerformed(EmployeeApplet.java:216)
        at com.org.EmployeeApplet.access$300(EmployeeApplet.java:7)
        at com.org.EmployeeApplet$4.actionPerformed(EmployeeApplet.java:71)
        at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
        at javax.swing.AbstractButton$Handler.actionPerformed(Unknown Source)
        at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
        at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
        at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
        at java.awt.Component.processMouseEvent(Unknown Source)
        at javax.swing.JComponent.processMouseEvent(Unknown Source)
        at java.awt.Component.processEvent(Unknown Source)
        at java.awt.Container.processEvent(Unknown Source)
        at java.awt.Component.dispatchEventImpl(Unknown Source)
        at java.awt.Container.dispatchEventImpl(Unknown Source)
        at java.awt.Component.dispatchEvent(Unknown Source)
        at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
        at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
        at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
        at java.awt.Container.dispatchEventImpl(Unknown Source)
        at java.awt.Component.dispatchEvent(Unknown Source)
        at java.awt.EventQueue.dispatchEvent(Unknown Source)
        at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
        at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
        at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
        at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
        at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
        at java.awt.EventDispatchThread.run(Unknown Source)

I am guessing from the above error that calling another class file from an applet is prohibited. Is there any way I can display the PollFrame from the applet?
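The trace shows the exception being thrown inside setDefaultCloseOperation: an unsigned applet may not request EXIT_ON_CLOSE, because closing the frame would then call System.exit, which the sandbox forbids (RuntimePermission exitVM). Below is a minimal, hedged Java sketch of the same pattern using DISPOSE_ON_CLOSE instead, so no exit permission is ever needed; the frame contents and the applet wiring are invented placeholders, not the poster's actual PollFrame or EmployeeApplet code.

    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.JApplet;
    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.JLabel;

    // Hypothetical stand-in for the poster's PollFrame form.
    class PollFrame extends JFrame {
        PollFrame() {
            super("Poll");
            // EXIT_ON_CLOSE asks for permission to terminate the JVM, which an unsigned
            // applet is not granted; DISPOSE_ON_CLOSE merely releases the window.
            setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            add(new JLabel("Poll form goes here"));
            pack();
        }
    }

    public class EmployeeApplet extends JApplet {
        @Override
        public void init() {
            JButton open = new JButton("Open poll");
            open.addActionListener(new ActionListener() {
                @Override
                public void actionPerformed(ActionEvent e) {
                    PollFrame poll = new PollFrame(); // initialize the form
                    poll.setLocationRelativeTo(null);
                    poll.setVisible(true);            // showing a JFrame from an applet is allowed
                }
            });
            add(open);
        }
    }

If the frame genuinely needs to shut down the whole application when it closes, the applet would have to be signed and granted that permission; otherwise DISPOSE_ON_CLOSE (or HIDE_ON_CLOSE) is the usual choice.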

  • Installing packages in R path[1]="\file/users/bernard/R/win-library/2.15": Access is denied

    - by user1812210
    I recently attended an R training course and was happily working with a laptop in RStudio. On my return to the office I installed RStudio and tried to run some scripts I had gathered from the course. However, these scripts required me to install packages, and when I tried to install the packages the result was an error:

    Error in install.packages : path[1]="\file/users/bernard/R/win-library/2.15": Access is denied

In my firm we write to a server drive referred to as the U: drive, rather than the hard disk on the desktop, for security reasons. Any ideas what is going on? I have checked the properties of the folder in Windows and it says I have permission, but still it fails. I have tried creating a folder on the C drive and directing R_LIBS_USER to it, but no luck.

  • Yum Update fails due to conflicting perl packages

    - by amRit
    Hi, I have a CentOS 5.2 i386 production box with Qmail, LAMP, etc. Since it showed me that it could upgrade to 5.3, I downloaded all the packages using yum update. However, the update fails due to conflicting perl packages. I believe that while installing Qmail, I had tried to install MailScanner and thus it upgraded perl itself. I read somewhere that we can uninstall the conflicting rpms and then go for the upgrade, but I am about 50% sure that if I go ahead and uninstall perl, my server will crash like a falling house of cards. Also, perl is needed by a variety of working packages, including vpopmail. Thus, I request anyone to please tell me if we have a safer solution to this. If there is no solution, then can somebody please teach me how to back up and restore Qmail as it is? I can handle the rest of the things. I sincerely appreciate those who have read and even thought about my problem. In anticipation of a solution, regards, amRit

  • yum list installed including version of all installed packages CentOS 5.4

    - by Andy
    I have a list of packages installed with yum on CentOS 5.4:

    [root@server ~]# yum list installed
    ...
    Installed Packages
    GConf2.x86_64             2.14.0-9.el5        installed
    ImageMagick.x86_64        6.2.8.0-4.el5_1.1   installed
    MAKEDEV.x86_64            3.23-1.2            installed
    MySQL-python.x86_64       1.2.1-1             installed

I would like to download these rpms locally using yumdownloader --resolve MySQL-python-1.2.1-1.x86_64 etc. However, the package formatting is different (MySQL-python.x86_64 1.2.1-1 vs MySQL-python-1.2.1-1.x86_64), so I am unable to download them using the above command. I don't want to have to parse the output of yum list installed, and I also don't want to use the contents of /var/log/yum.log* as I'll have to account for erased packages and version discrepancies. However, /var/log/yum.log* does have the formatting I require...

    May 25 14:58:15 Installed: groff-1.18.1.1-11.1.x86_64
    May 25 14:58:15 Installed: bzip2-1.0.3-4.el5_2.x86_64

Any suggestions?

  • Downgrade 'local' packages in Debian/Ubuntu

    - by Matt Joiner
    I recently unticked the "pre-released updates" option in Software Sources on my Ubuntu Lucid 10.04.1 installation. The Ubuntu wiki states the following regarding this source: "The proposed updates are updates which are waiting to be moved into the recommended updates queue after some testing. They may never reach recommended or they may be replaced with a more recent update." Roughly 20 installed packages have indeed not made it into recommended updates, and occasionally cause conflicts when I install new software, as related packages of the newer version are unavailable now that I've disabled the source. How can I force a downgrade of all packages for which an earlier version exists in an enabled repository?

  • View changelog of all packages to be upgraded before upgrading

    - by Stein G. Strindhaug
    When using Synaptic on my Ubuntu desktop computer I can review the changelogs of all the packages to be upgraded, and unselect a package for upgrade if I want. On my desktop I usually install everything, but I like to at least review what the changes are so that I can delay the upgrade if I suspect it could cause problems with the development tools I use. On a server (Ubuntu Server) with no X server, how can I do the same thing on the console:

- list all packages that will be updated (apt-get --dry-run upgrade does this, along with a lot of noisy simulated install messages),
- view the changelog (if any) from the last upgrade to the version it will be upgraded to,
- select which packages I want to ignore, or which I want to upgrade?

I've searched a lot for this but I haven't found anything; possibly I'm not using the correct terminology, but surely this must be possible. Synaptic must get its info from some low-level tool, I assume? Complicated shell scripts are welcome too, if this is not already easily done with the existing tools.

  • How to resolve Unmet dependencies error?

    - by dandelion
    Using my new install of Ubuntu I haven't been able to download anything from the Software Center, except the maryo game, without getting the following error:

        The following packages have unmet dependencies:
        vlc: Depends: vlc-nox (= 1.1.12-2~oneiric1) but 1.1.12-2~oneiric1 is to be installed
             Depends: libaa1 (>= 1.4p5) but 1.4p5-38build1 is to be installed
             Depends: libavcodec-extra-53 (>= 4:0.7-1) but 4:0.7.3ubuntu0.11.10.1 is to be installed
             Depends: libavutil-extra-51 (>= 4:0.7-1) but 4:0.7.3ubuntu0.11.10.1 is to be installed
             Depends: libc6 (>= 2.8) but 2.13-20ubuntu5.1 is to be installed
             Depends: libfreetype6 (>= 2.2.1) but 2.4.4-2ubuntu1.1 is to be installed
             Depends: libgcc1 (>= 1:4.1.1) but 1:4.6.1-9ubuntu3 is to be installed
             Depends: libqtcore4 (>= 4:4.7.0~beta1) but 4:4.7.4-0ubuntu8.1 is to be installed
             Depends: libqtgui4 (>= 4:4.5.3) but 4:4.7.4-0ubuntu8.1 is to be installed
             Depends: libsdl-image1.2 (>= 1.2.10) but 1.2.10-2.1 is to be installed
             Depends: libsdl1.2debian (>= 1.2.10-1) but 1.2.14-6.1ubuntu4 is to be installed
             Depends: libstdc++6 (>= 4.6) but 4.6.1-9ubuntu3 is to be installed
             Depends: libva-x11-1 (> 1.0.12~) but it is not going to be installed
             Depends: libva1 (> 1.0.12~) but it is not going to be installed
             Depends: libxcb-randr0 (>= 1.1) but it is not going to be installed
             Depends: libxcb-xv0 (>= 1.2) but it is not going to be installed
             Depends: zlib1g (>= 1:1.2.3.3.dfsg) but 1:1.2.3.4.dfsg-3ubuntu3 is to be installed

    My system specs: Ubuntu 11.10, 64-bit; ge-g41m-es2l motherboard; AMD 5770 video card; WDC Green 500 GB hard drive. I have recently changed the motherboard, but otherwise have not changed my computer from when I was previously running the same version of Ubuntu.

    Edit: still unable to download. Output of sudo apt-get update:

        ~$ sudo apt-get update
        Ign http://extras.ubuntu.com oneiric InRelease
        Ign http://security.ubuntu.com oneiric-security InRelease
        Ign http://archive.canonical.com oneiric InRelease
        Ign http://ppa.launchpad.net oneiric InRelease
        Ign http://us.archive.ubuntu.com oneiric InRelease
        Ign http://us.archive.ubuntu.com oneiric-updates InRelease
        Ign http://us.archive.ubuntu.com oneiric-backports InRelease
        Hit http://extras.ubuntu.com oneiric Release.gpg
        Hit http://archive.canonical.com oneiric Release.gpg
        Hit http://security.ubuntu.com oneiric-security Release.gpg
        Hit http://ppa.launchpad.net oneiric Release.gpg
        Ign http://us.archive.ubuntu.com oneiric-proposed InRelease
        Hit http://us.archive.ubuntu.com oneiric Release.gpg
        Hit http://extras.ubuntu.com oneiric Release
        Hit http://archive.canonical.com oneiric Release
        Hit http://security.ubuntu.com oneiric-security Release
        Hit http://ppa.launchpad.net oneiric Release
        Hit http://us.archive.ubuntu.com oneiric-updates Release.gpg
        Hit http://us.archive.ubuntu.com oneiric-backports Release.gpg
        Hit http://extras.ubuntu.com oneiric/main Sources
        Hit http://archive.canonical.com oneiric/partner i386 Packages
        Hit http://security.ubuntu.com oneiric-security/main Sources
        Hit http://ppa.launchpad.net oneiric/main Sources
        Hit http://us.archive.ubuntu.com oneiric-proposed Release.gpg
        Hit http://extras.ubuntu.com oneiric/main i386 Packages
        Ign http://extras.ubuntu.com oneiric/main TranslationIndex
        Hit http://ppa.launchpad.net oneiric/main i386 Packages
        Ign http://ppa.launchpad.net oneiric/main TranslationIndex
        Ign http://archive.canonical.com oneiric/partner TranslationIndex
        Hit http://security.ubuntu.com oneiric-security/restricted Sources
        Hit http://security.ubuntu.com oneiric-security/universe Sources
        Hit http://security.ubuntu.com oneiric-security/multiverse Sources
        Hit http://security.ubuntu.com oneiric-security/main i386 Packages
        Hit http://security.ubuntu.com oneiric-security/restricted i386 Packages
        Hit http://us.archive.ubuntu.com oneiric Release
        Hit http://us.archive.ubuntu.com oneiric-updates Release
        Hit http://security.ubuntu.com oneiric-security/universe i386 Packages
        Hit http://security.ubuntu.com oneiric-security/multiverse i386 Packages
        Hit http://security.ubuntu.com oneiric-security/main TranslationIndex
        Hit http://security.ubuntu.com oneiric-security/multiverse TranslationIndex
        Hit http://security.ubuntu.com oneiric-security/restricted TranslationIndex
        Hit http://security.ubuntu.com oneiric-security/universe TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-backports Release
        Hit http://security.ubuntu.com oneiric-security/main Translation-en
        Hit http://security.ubuntu.com oneiric-security/multiverse Translation-en
        Hit http://us.archive.ubuntu.com oneiric-proposed Release
        Hit http://us.archive.ubuntu.com oneiric/main Sources
        Hit http://us.archive.ubuntu.com oneiric/restricted Sources
        Hit http://us.archive.ubuntu.com oneiric/universe Sources
        Hit http://us.archive.ubuntu.com oneiric/multiverse Sources
        Hit http://security.ubuntu.com oneiric-security/restricted Translation-en
        Hit http://us.archive.ubuntu.com oneiric/restricted i386 Packages
        Hit http://us.archive.ubuntu.com oneiric/universe i386 Packages
        Hit http://us.archive.ubuntu.com oneiric/multiverse i386 Packages
        Hit http://us.archive.ubuntu.com oneiric/main TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric/multiverse TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric/restricted TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric/universe TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-updates/main Sources
        Hit http://us.archive.ubuntu.com oneiric-updates/restricted Sources
        Hit http://security.ubuntu.com oneiric-security/universe Translation-en
        Hit http://us.archive.ubuntu.com oneiric-updates/universe Sources
        Hit http://us.archive.ubuntu.com oneiric-updates/multiverse Sources
        Hit http://us.archive.ubuntu.com oneiric-updates/main i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-updates/restricted i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-updates/universe i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-updates/multiverse i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-updates/main TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-updates/multiverse TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-updates/restricted TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-updates/universe TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-backports/main Sources
        Hit http://us.archive.ubuntu.com oneiric-backports/restricted Sources
        Hit http://us.archive.ubuntu.com oneiric-backports/universe Sources
        Hit http://us.archive.ubuntu.com oneiric-backports/multiverse Sources
        Hit http://us.archive.ubuntu.com oneiric-backports/main i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-backports/restricted i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-backports/universe i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-backports/multiverse i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-backports/main TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-backports/multiverse TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-backports/restricted TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-backports/universe TranslationIndex
        Ign http://extras.ubuntu.com oneiric/main Translation-en_US
        Ign http://ppa.launchpad.net oneiric/main Translation-en_US
        Hit http://us.archive.ubuntu.com oneiric-proposed/restricted i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-proposed/main i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-proposed/multiverse i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-proposed/universe i386 Packages
        Hit http://us.archive.ubuntu.com oneiric-proposed/main TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-proposed/multiverse TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-proposed/restricted TranslationIndex
        Hit http://us.archive.ubuntu.com oneiric-proposed/universe TranslationIndex
        Ign http://archive.canonical.com oneiric/partner Translation-en_US
        Ign http://extras.ubuntu.com oneiric/main Translation-en
        Ign http://ppa.launchpad.net oneiric/main Translation-en
        Ign http://archive.canonical.com oneiric/partner Translation-en
        Get:1 http://us.archive.ubuntu.com oneiric/main i386 Packages [1,583 kB]
        Hit http://us.archive.ubuntu.com oneiric/main Translation-en
        Hit http://us.archive.ubuntu.com oneiric/multiverse Translation-en
        Hit http://us.archive.ubuntu.com oneiric/restricted Translation-en
        Hit http://us.archive.ubuntu.com oneiric/universe Translation-en
        Hit http://us.archive.ubuntu.com oneiric-updates/main Translation-en
        Hit http://us.archive.ubuntu.com oneiric-updates/multiverse Translation-en
        Hit http://us.archive.ubuntu.com oneiric-updates/restricted Translation-en
        Hit http://us.archive.ubuntu.com oneiric-updates/universe Translation-en
        Hit http://us.archive.ubuntu.com oneiric-backports/main Translation-en
        Hit http://us.archive.ubuntu.com oneiric-backports/multiverse Translation-en
        Hit http://us.archive.ubuntu.com oneiric-backports/restricted Translation-en
        Hit http://us.archive.ubuntu.com oneiric-backports/universe Translation-en
        Hit http://us.archive.ubuntu.com oneiric-proposed/main Translation-en
        Hit http://us.archive.ubuntu.com oneiric-proposed/multiverse Translation-en
        Hit http://us.archive.ubuntu.com oneiric-proposed/restricted Translation-en
        Hit http://us.archive.ubuntu.com oneiric-proposed/universe Translation-en
        Err http://us.archive.ubuntu.com oneiric/main i386 Packages
          404 Not Found [IP: 91.189.92.179 80]
        Fetched 1 B in 2s (0 B/s)
        W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/oneiric/main/binary-i386/Packages 404 Not Found [IP: 91.189.92.179 80]
        E: Some index files failed to download. They have been ignored, or old ones used instead.
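
    Not part of the original question, but one way to dig into the "but it is not going to be installed" lines above is to ask apt which repository pockets are offering those libraries and at what versions. A minimal Python 3 sketch (the package names are copied from the error above; apt-cache is assumed to be on the PATH):

        # Sketch only: print the `apt-cache policy` report for each package that
        # VLC's dependency error says is "not going to be installed", so you can
        # see whether only oneiric-proposed (or a broken index) offers them.
        import subprocess

        PACKAGES = ["libva1", "libva-x11-1", "libxcb-randr0", "libxcb-xv0"]

        def policy(package):
            """Return the `apt-cache policy` output for one package as text."""
            return subprocess.check_output(["apt-cache", "policy", package]).decode()

        if __name__ == "__main__":
            for name in PACKAGES:
                print(policy(name))

    Given that the apt-get update run above ends with a 404 on the oneiric/main package index, apt may simply be working from a stale or missing list of main packages, which would explain why those dependencies cannot be resolved until the index downloads successfully (for example after switching to another mirror).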

    Read the article

  • Why kernel source is not installed

    - by Subhajit
    I want to install the kernel source in Ubuntu 12.04, which is not installed. I checked this with the following command:

        dpkg -s kernel

    The output is: Kernel is not installed, no information available. Hence I followed these steps to install it:

    1. Installed the dependencies:

        sudo apt-get install gcc libncurses5-dev git-core kernel-package fakeroot build-essential
        sudo apt-get update && sudo apt-get upgrade

    2. Downloaded the kernel source:

        wget http://www.kernel.org/pub/linux/kernel/v3.x/linux-3.5.tar.bz2
        tar -xvf linux-3.5.tar.bz2
        cd linux-3.5/

    3. Compiled the source code to generate the .deb packages:

        make-kpkg clean
        fakeroot make-kpkg --initrd --append-to-version=-spica kernel_image kernel_headers

    4. Installed the .deb packages (two .deb packages are generated: one installs the kernel headers, the other installs the kernel image):

        sudo dpkg -i linux-*.deb

    But after rebooting it seems the kernel is not installed (checked with dpkg -s kernel). Please tell me where I am going wrong. Also, in step 3 I believe I am building a new kernel image (with -spica appended to the version), but during boot this new kernel is not shown as an option. Please help me.
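
    Not part of the original question, but one thing worth checking: Ubuntu has no package literally named "kernel", so dpkg -s kernel reports "not installed" even on a healthy system; installed kernels are packaged as linux-image-*. A minimal Python sketch of a more reliable check (dpkg-query and uname are assumed to be on the PATH):

        # Sketch only: list installed linux-image-* packages next to the kernel
        # that is currently running, instead of querying a package named "kernel".
        import subprocess

        def installed_kernels():
            """Names of linux-image-* packages whose dpkg status is 'installed'."""
            out = subprocess.check_output(
                ["dpkg-query", "-W", "--showformat=${Package} ${Status}\n",
                 "linux-image-*"]
            ).decode()
            return [line.split()[0] for line in out.splitlines()
                    if line.endswith("install ok installed")]

        def running_kernel():
            """Release string of the kernel that is currently booted."""
            return subprocess.check_output(["uname", "-r"]).decode().strip()

        if __name__ == "__main__":
            print("Running kernel:", running_kernel())
            for package in installed_kernels():
                print("Installed:", package)

    If the freshly built package really was installed, it should show up here as something like linux-image-3.5.0-spica; if it is installed but missing from the boot menu, running sudo update-grub is usually the step that adds the new entry.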

    Read the article

  • Images missing after moving Django to new server

    - by miszczu
    I'm moving a Django project to a new server. I'm a newbie with Django, and I don't know where the upload folder should be; it holds all the images that should be displayed on the website. I haven't seen an upload folder I could specify in the config file, so I'm guessing it is always the same location for Django projects, or I just can't find it. The locations are saved in the database. When I put the uploaded files into the media folder, so that the URL was like domain.co.uk/media/upload/media/images/year/month/day/image_name.ext (the same as on the old website), the images on the website were still missing. All images are visible if I enter the URL by hand, but Django doesn't seem to see the files. I also checked the Django log file:

        2012-05-30 09:13:33,393 ERROR render: Thumbnail tag failed: [in /usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/templatetags/thumbnail.py (line 49)]
        Traceback (most recent call last):
          File "/usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/templatetags/thumbnail.py", line 45, in render
            return self._render(context)
          File "/usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/templatetags/thumbnail.py", line 97, in _render
            file_, geometry, **options
          File "/usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/base.py", line 50, in get_thumbnail
            cached = default.kvstore.get(thumbnail)
          File "/usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/kvstores/base.py", line 25, in get
            return self._get(image_file.key)
          File "/usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/kvstores/base.py", line 123, in _get
            value = self._get_raw(add_prefix(key, identity))
          File "/usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/kvstores/cached_db_kvstore.py", line 26, in _get_raw
            value = KVStoreModel.objects.get(key=key).value
          File "/usr/lib/python2.6/site-packages/django/db/models/manager.py", line 132, in get
            return self.get_query_set().get(*args, **kwargs)
          File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 344, in get
            num = len(clone)
          File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 82, in __len__
            self._result_cache = list(self.iterator())
          File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 273, in iterator
            for row in compiler.results_iter():
          File "/usr/lib/python2.6/site-packages/django/db/models/sql/compiler.py", line 680, in results_iter
            for rows in self.execute_sql(MULTI):
          File "/usr/lib/python2.6/site-packages/django/db/models/sql/compiler.py", line 735, in execute_sql
            cursor.execute(sql, params)
          File "/usr/lib/python2.6/site-packages/django/db/backends/util.py", line 34, in execute
            return self.cursor.execute(sql, params)
          File "/usr/lib/python2.6/site-packages/django/db/backends/mysql/base.py", line 86, in execute
            return self.cursor.execute(query, args)
          File "/usr/lib64/python2.6/site-packages/MySQLdb/cursors.py", line 174, in execute
            self.errorhandler(self, exc, value)
          File "/usr/lib64/python2.6/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
            raise errorclass, errorvalue
        DatabaseError: (1146, "Table 'thumbnail_kvstore' doesn't exist")
        2012-05-30 09:13:33,396 DEBUG execute: (0.000) SELECT `freetext_freetext`.`id`, `freetext_freetext`.`key`, `freetext_freetext`.`content`, `freetext_freetext`.`active` FROM `freetext_freetext` WHERE (`freetext_freetext`.`active` = True AND `freetext_freetext`.`key` = office-closed-message ); args=(True, u'office-closed-message') [in /usr/lib/python2.6/site-packages/django/db/backends/util.py (line 44)]
        2012-05-30 09:13:33,399 DEBUG execute: (0.000) SELECT `menus_menu`.`id`, `menus_menu`.`name`, `menus_menu`.`slug`, `menus_menu`.`base_url`, `menus_menu`.`description`, `menus_menu`.`enabled` FROM `menus_menu` WHERE (`menus_menu`.`enabled` = True AND `menus_menu`.`slug` = about ); args=(True, u'about') [in /usr/lib/python2.6/site-packages/django/db/backends/util.py (line 44)]
        2012-05-30 09:13:33,401 DEBUG execute: (0.000) SELECT `menus_menuitem`.`id`, `menus_menuitem`.`menu_id`, `menus_menuitem`.`title`, `menus_menuitem`.`url`, `menus_menuitem`.`order` FROM `menus_menuitem` INNER JOIN `menus_menu` ON (`menus_menuitem`.`menu_id` = `menus_menu`.`id`) WHERE `menus_menu`.`slug` = about ORDER BY `menus_menuitem`.`order` ASC; args=(u'about',) [in /usr/lib/python2.6/site-packages/django/db/backends/util.py (line 44)]
        2012-05-30 09:13:33,404 DEBUG execute: (0.000) SELECT `freetext_freetext`.`id`, `freetext_freetext`.`key`, `freetext_freetext`.`content`, `freetext_freetext`.`active` FROM `freetext_freetext` WHERE (`freetext_freetext`.`active` = True AND `freetext_freetext`.`key` = contactdetails-footer ); args=(True, u'contactdetails-footer') [in /usr/lib/python2.6/site-packages/django/db/backends/util.py (line 44)]

    I checked the database and there is no table called thumbnail_kvstore; I have a database backup, and the table doesn't exist in the backup files either. All the uploaded files I have are in media/uploads/media/. I'm also getting errors on some pages:

        Syntax error. Expected: ``thumbnail source geometry [key1=val1 key2=val2...] as var``
        /usr/lib/python2.6/site-packages/sorl_thumbnail-11.12-py2.6.egg/sorl/thumbnail/templatetags/thumbnail.py in __init__, line 72
        In template /var/www/vhosts/domain.co.uk/sites/apps/shop/products/templates/products/product_detail.html, error at line 34
        {% thumbnail image.file "800x700" detail as zoom %}

    Maybe some of the modules I installed are not the right versions. I don't know how to fix it. I'm using CentOS 6, mod_wsgi, Apache, and Python 2.6.

    Update 1.0: The old server ran Django 1.3; the new one runs Django 1.3.1.

    Update 1.1: I think I know where the problem is. I tried python manage.py syncdb and this is the output:

        Syncing...
        Creating tables ...
        The following content types are stale and need to be deleted:
            orders | ordercontact
        Any objects related to these content types by a foreign key will also be deleted. Are you sure you want to delete these content types? If you're unsure, answer 'no'.
        Type 'yes' to continue, or 'no' to cancel: no
        Installing custom SQL ...
        Installing indexes ...
        No fixtures found.
        Synced: > django.contrib.auth > django.contrib.contenttypes > django.contrib.sessions > django.contrib.sites > django.contrib.messages > django.contrib.admin > django.contrib.admindocs > django.contrib.markup > django.contrib.sitemaps > django.contrib.redirects > django_filters > freetext > sorl.thumbnail > django_extensions > south > currencies > pagination > tagging > honeypot > core > faq > logentry > menus > news > shop > shop.cart > shop.orders
        Not synced (use migrations): - dbtemplates - contactform - links - media - pages - popularity - testimonials - shop.brands - shop.collections - shop.discount - shop.pricing - shop.product_types - shop.products - shop.shipping - shop.tax
        (use ./manage.py migrate to migrate these)

    Next I ran python manage.py migrate, and this is what I get:

        Running migrations for dbtemplates:
        - Migrating forwards to 0002_auto__del_unique_template_name.
        > dbtemplates:0001_initial
        ! Error found during real run of migration! Aborting.
        ! Since you have a database that does not support running
        ! schema-altering statements in transactions, we have had
        ! to leave it in an interim state between migrations.
        ! You *might* be able to recover with:
        = DROP TABLE `django_template` CASCADE; []
        = DROP TABLE `django_template_sites` CASCADE; []
        ! The South developers regret this has happened, and would
        ! like to gently persuade you to consider a slightly
        ! easier-to-deal-with DBMS.
        ! NOTE: The error which caused the migration to fail is further up.
        Traceback (most recent call last):
          File "manage.py", line 13, in <module>
            execute_manager(settings)
          File "/usr/lib/python2.6/site-packages/django/core/management/__init__.py", line 438, in execute_manager
            utility.execute()
          File "/usr/lib/python2.6/site-packages/django/core/management/__init__.py", line 379, in execute
            self.fetch_command(subcommand).run_from_argv(self.argv)
          File "/usr/lib/python2.6/site-packages/django/core/management/base.py", line 191, in run_from_argv
            self.execute(*args, **options.__dict__)
          File "/usr/lib/python2.6/site-packages/django/core/management/base.py", line 220, in execute
            output = self.handle(*args, **options)
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/management/commands/migrate.py", line 105, in handle
            ignore_ghosts = ignore_ghosts,
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/__init__.py", line 191, in migrate_app
            success = migrator.migrate_many(target, workplan, database)
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/migrators.py", line 221, in migrate_many
            result = migrator.__class__.migrate_many(migrator, target, migrations, database)
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/migrators.py", line 292, in migrate_many
            result = self.migrate(migration, database)
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/migrators.py", line 125, in migrate
            result = self.run(migration)
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/migrators.py", line 99, in run
            return self.run_migration(migration)
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/migrators.py", line 81, in run_migration
            migration_function()
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/migration/migrators.py", line 57, in <lambda>
            return (lambda: direction(orm))
          File "/usr/lib/python2.6/site-packages/django_dbtemplates-1.3-py2.6.egg/dbtemplates/migrations/0001_initial.py", line 18, in forwards
            ('last_changed', self.gf('django.db.models.fields.DateTimeField')(default=datetime.datetime.now)),
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/db/generic.py", line 226, in create_table
            ', '.join([col for col in columns if col]),
          File "/usr/lib/python2.6/site-packages/South-0.7.3-py2.6.egg/south/db/generic.py", line 150, in execute
            cursor.execute(sql, params)
          File "/usr/lib/python2.6/site-packages/django/db/backends/util.py", line 34, in execute
            return self.cursor.execute(sql, params)
          File "/usr/lib/python2.6/site-packages/django/db/backends/mysql/base.py", line 86, in execute
            return self.cursor.execute(query, args)
          File "/usr/lib64/python2.6/site-packages/MySQLdb/cursors.py", line 174, in execute
            self.errorhandler(self, exc, value)
          File "/usr/lib64/python2.6/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
            raise errorclass, errorvalue
        _mysql_exceptions.OperationalError: (1050, "Table 'django_template' already exists")

    I also ran python manage.py migrate --list, and the output is:

        dbtemplates (*) 0001_initial (*) 0002_auto__del_unique_template_name
        contactform (*) 0001_initial (*) 0002_auto__add_callback (*) 0003_auto__add_field_callback_notes (*) 0004_auto__add_field_callback_is_closed__add_field_callback_closed (*) 0005_auto__add_field_callback_url (*) 0006_auto__add_contact (*) 0007_auto__add_field_contact_category (*) 0008_auto__add_field_contact_url
        links (*) 0001_initial (*) 0002_auto__add_field_category_enabled__add_field_category_order
        media (*) 0001_initial (*) 0002_auto__del_field_image_external_url__add_field_image_link_url__del_fiel (*) 0003_add_model_FileAttachment (*) 0004_auto__chg_field_file_slug__chg_field_image_slug (*) 0005_auto__chg_field_image_file (*) 0006_auto__chg_field_file_file
        pages (*) 0001_initial (*) 0002_auto__chg_field_page_meta_description__chg_field_page_meta_title__chg_ (*) 0003_auto__add_field_page_show_in_sitemap (*) 0004_auto__add_field_page_changefreq__add_field_page_priority
        popularity (*) 0001_initial
        testimonials (*) 0001_initial (*) 0002_auto__add_field_testimonial_is_featured
        brands (*) 0001_initial (*) 0002_auto__add_field_brand_template (*) 0003_auto__chg_field_brand_meta_description__chg_field_brand_meta_title__ch (*) 0004_auto__add_field_brand_url (*) 0005_auto__del_field_brand_image__add_field_brand_logo
        collections (*) 0001_initial (*) 0002_auto__add_field_collection_discount (*) 0003_auto__chg_field_collection_meta_description__chg_field_collection_meta (*) 0004_auto__add_field_collection_is_featured (*) 0005_auto__add_field_collection_order
        discount (*) 0001_initial (*) 0002_added_field_discount_description (*) 0003_auto__add_field_discountvoucher_automatic (*) 0004_auto__add_field_discountvoucher_collection (*) 0005_auto__del_field_discountvoucher_collection (*) 0006_auto__chg_field_discountvoucher_expiry_date
        pricing (*) 0001_initial (*) 0002_auto__add_pricingrule
        product_types (*) 0001_initial (*) 0002_auto__add_field_producttype_meta_title__add_field_producttype_meta_des (*) 0003_auto__add_field_producttype_summary__add_field_producttype_description
        products (*) 0001_initial (*) 0002_auto__del_field_product_is_featured (*) 0003_auto__chg_field_product_meta_keywords__chg_field_product_meta_descript (*) 0004_auto
        shipping (*) 0001_initial (*) 0002_auto__add_field_shippingmethod_includes_tax__add_field_shippingmethod_ (*) 0003_auto__add_field_shippingmethod_order (*) 0004_auto__del_field_shippingmethod_tax_rate__del_field_shippingmethod_incl (*) 0005_auto__del_field_shippingrule_enabled
        tax (*) 0001_initial (*) 0002_auto__add_field_taxrate_internal_name (*) 0003_initial_internal_names (*) 0004_auto__add_unique_taxrate_internal_name (*) 0005_force_unique_taxrate_name (*) 0006_auto__add_unique_taxrate_name

    After that, some image sources were something like this: src="cache/1e/bd/1ebd719910aa843238028edd5fe49e71.jpg". Could anyone help me with syncdb, please?
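
    Not from the original project, but for context: in Django the upload folder is not a fixed location; it is whatever the MEDIA_ROOT setting points at, and the FileField/ImageField paths stored in the database are relative to it. A minimal settings.py sketch for a Django 1.3-era project (the paths here are hypothetical and need to match wherever the copied media/uploads tree actually lives on the new server):

        # Sketch only: the two settings that decide where uploaded files live and
        # under which URL prefix they are served. Values stored in the database
        # are relative to MEDIA_ROOT, so it must point at the directory that
        # actually contains the uploads/... tree copied from the old server.
        import os

        PROJECT_DIR = os.path.dirname(os.path.abspath(__file__))

        MEDIA_ROOT = os.path.join(PROJECT_DIR, "media")  # hypothetical location
        MEDIA_URL = "/media/"                            # prefix served by Apache

    The missing thumbnail_kvstore table is a separate issue: it belongs to sorl.thumbnail and is normally created by python manage.py syncdb. The later OperationalError (1050, "Table 'django_template' already exists") is the usual sign of a South initial migration being run against tables that already exist; South's --fake flag covers exactly that case, recording the initial migration as applied instead of re-creating the tables.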

    Read the article

  • SSIS packages incompatibilities between SSIS 2008 and SSIS 2008 R2

    - by Marco Russo (SQLBI)
    When you install the SQL Server 2008 R2 workstation components you get a newer version of BIDS (BI Development Studio, included in the workstation components) that replaces the BIDS 2008 version (BIDS 2005 still lives side-by-side). Everything would be fine if you could use the newer version to edit any 2008 AND 2008 R2 project. The SSIS editor doesn't offer a way to set the "compatibility level" of a package, because it is almost all unchanged. However, if a package has an ADO.NET Destination Adapter, there is a difference...(read more)

    Read the article

  • Is there a good Lotus Notes open-source alternative?

    - by Ben S
    At my work we use Lotus Notes 6.5 for our email, meeting scheduling and instant messaging. I can't stand the horrible UI, buggy meeting scheduling and overall '90s feel when using it, and would love to replace it with open-source alternatives. So far I've been able to set up Thunderbird for email, and I should also be able to configure Pidgin to do IM, but I can't find any replacement for the meeting scheduling. I need to be able to receive meeting requests and respond to them. I've looked around trying to get the Thunderbird plugin Lightning to manage the scheduling, but everything I've read so far requires me to export .ics files from Lotus Notes or otherwise keep Lotus Notes around for day-to-day activities. I've also looked into using Evolution as the client, but I found even less information for it than I did for Thunderbird. How can I easily send, receive and respond to Lotus Notes meetings using an open-source alternative? Alternatively, if there is a full drop-in replacement for Lotus Notes I would also consider it. Note: my desktop at work is a Windows XP machine, though I wouldn't be opposed to a solution requiring Cygwin at this point. Edit: I have no power over the server. I only want a compatible client.

    Read the article

< Previous Page | 35 36 37 38 39 40 41 42 43 44 45 46  | Next Page >