Search Results

Search found 2109 results on 85 pages for 'depends'.

Page 38 of 85

  • How do I install VLC when I get this error?

    - by YumYumYum
    How to install VLC? (with error showing such). root@sun-desktop:/var/tmp# apt-get install vlc Reading package lists... Done Building dependency tree Reading state information... Done vlc is already the newest version. The following packages were automatically installed and are no longer required: liblash3 libreoffice-l10n-common libgsf-1-common libcutter-dev pocketsphinx-hmm-wsj1 libfluidsynth1 libftgl2 projectm-data libprojectm-qt1 libgnomevfs2-extra libbml0 libprojectm2 libpocketsphinx1 libsphinxbase1 buzztard-data libbabl-0.0-0 libgegl-0.0-0 libhal1 libgsf-1-114 libsidplay1 pocketsphinx-utils liboil0.3 pocketsphinx-lm-wsj libcutter0 cutter-testing-framework-bin Use 'apt-get autoremove' to remove them. 0 upgraded, 0 newly installed, 0 to remove and 239 not upgraded. 2 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? y Setting up vlc-nox (1.1.9-1ubuntu1.3) ... /var/lib/dpkg/info/vlc-nox.postinst: 10: /usr/lib/vlc/vlc-cache-gen: not found dpkg: error processing vlc-nox (--configure): subprocess installed post-installation script returned error exit status 127 dpkg: dependency problems prevent configuration of vlc: vlc depends on vlc-nox (= 1.1.9-1ubuntu1.3); however: Package vlc-nox is not configured yet. dpkg: error processing vlc (--configure): dependency problems - leaving unconfigured No apport report written because the error message indicates its a followup error from a previous failure. Errors were encountered while processing: vlc-nox vlc E: Sub-process /usr/bin/dpkg returned an error code (1) # sudo apt-get autoremove vlc vlc-nox Reading package lists... Done Building dependency tree Reading state information... Done Package vlc is not installed, so not removed Package vlc-nox is not installed, so not removed 0 upgraded, 0 newly installed, 0 to remove and 237 not upgraded.
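
    The failing step is the vlc-nox post-installation script, which cannot find /usr/lib/vlc/vlc-cache-gen. A hedged first step (a sketch of the usual dpkg recovery sequence, not necessarily the fix given in the linked article) is to reinstall vlc-nox so that binary comes back, then let dpkg finish the pending configuration:

        # Re-download and unpack vlc-nox so /usr/lib/vlc/vlc-cache-gen is restored
        sudo apt-get install --reinstall vlc-nox
        # Re-run any post-installation scripts that were left half-configured
        sudo dpkg --configure -a
        # Let apt pull in anything it still considers missing
        sudo apt-get -f install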

    Read the article

  • SOA Governance Starts with People and Processes

    - by Jyothi Swaroop
    While we all agree that SOA Governance is about People, Processes and Technology. Some experts are of the opinion that SOA Governance begins with People and Processes but needs to be empowered with technology to achieve the best results. Here's an interesting piece from David Linthicum on eBizq: In the world of SOA, the concept of SOA governance is getting a lot of attention. However, how SOA governance is defined and implemented really depends on the SOA governance vendor who just left the building within most enterprises. Indeed, confusion is a huge issue when considering SOA governance, and the core issues are more about the fundamentals of people and processes, and not about the technology. SOA governance is a concept used for activities related to exercising control over services in an SOA, including tracking the services, monitoring the service, and controlling changes made to the services, simple put. The trouble comes in when SOA governance vendors attempt to define SOA governance around their technology, all with different approaches to SOA governance. Thus, it's important that those building SOAs within the enterprise take a step back and understand what really need to support the concept of SOA governance. The value of SOA governance is pretty simple. Since services make up the foundation of an SOA, and are at their essence the behavior and information from existing systems externalized, it's critical to make sure that those accessing, creating, and changing services do so using a well controlled and orderly mechanism. Those of you, who already have governance in place, typically around enterprise architecture efforts, will be happy to know that SOA governance does not replace those processes, but becomes a mechanism within the larger enterprise governance concept. People and processes are first thing on the list to get under control before you begin to toss technology at this problem. This means establishing an understanding of SOA governance within the team members, including why it's important, who's involved, and the core processes that are to be follow to make SOA governance work. Indeed, when creating the core SOA governance strategy should really be independent of the technology. The technology will change over the years, but the core processes and discipline should be relatively durable over time.

    Read the article

  • not able to upgrade maas to 1.4?

    - by SaM
    I am running ubuntu 13.04 LTS, and maas version runnung is maas 1.3+bzr1470+dfsg-0+1474+175~ppa0~ubuntu13.04.1, so i'm trying to upgrade it to mass 1.4 but its failing, sam@xsmaas01:~$ sudo apt-get install maas [sudo] password for sam: Reading package lists... Done Building dependency tree Reading state information... Done The following packages will be upgraded: maas 1 upgraded, 0 newly installed, 0 to remove and 87 not upgraded. 2 not fully installed or removed. Need to get 0 B/1,912 B of archives. After this operation, 1,024 B of additional disk space will be used. (Reading database ... 85268 files and directories currently installed.) Preparing to replace maas 1.3+bzr1470+dfsg-0+1474+175~ppa0~ubuntu13.04.1 (using .../maas_1.4+bzr1693+dfsg-0ubuntu2~ctools0_all.deb) ... Unpacking replacement maas ... Setting up maas-cluster-controller (1.4+bzr1693+dfsg-0ubuntu2~ctools0) ... ERROR: Module version does not exist! dpkg: error processing maas-cluster-controller (--configure): subprocess installed post-installation script returned error exit status 1 dpkg: dependency problems prevent configuration of maas: maas depends on maas-cluster-controller; however: Package maas-cluster-controller is not configured yet. dpkg: error processing maas (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: maas-cluster-controller maas E: Sub-process /usr/bin/dpkg returned an error code (1) sam@maas01:~$ Can anyone help me with this?
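
    The dpkg output shows the maas-cluster-controller post-installation script aborting, which then blocks configuration of the maas metapackage itself. The excerpt does not reveal what triggers the "ERROR: Module version does not exist!" message, so the following is only a generic recovery sketch for half-configured packages, not the specific fix:

        # Try to finish configuring whatever dpkg left unconfigured
        sudo dpkg --configure -a
        # Let apt repair remaining breakage, then retry the upgrade
        sudo apt-get -f install
        sudo apt-get install maas maas-cluster-controller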

    Read the article

  • How to solve package issues/dependencies

    - by Wolfgang Kuehne
    Background info I am trying to install Veins simulation environment by following the tutorial provided by the author. In step 1 it is required to install some packages in Linux, the tutorial suggest this commands to be executed on Terminal: sudo apt-get install build-essential gcc g++ bison flex perl tcl-dev tk-dev blt libxml2-dev zlib1g-dev default-jre doxygen graphviz libwebkitgtk-1.0-0 openmpi-bin libopenmpi-dev libpcap-dev autoconf automake libtool libxerces-c2-dev proj libgdal1-dev libfox-1.6-dev When I execute this command, I immediately get: E: Package 'proj' has no installation candidate Then I remove the proj from the command and execute it again without proj in it, next I get: The following packages have unmet dependencies: libgdal1-dev : Depends: libgdal-dev but it is not going to be installed E: Unable to correct problems, you have held broken packages. So, I remove libgdal1-dev from the command as well. And it executes file, by downloading the remaining packages. To troubleshoot the problem with proj and libdgal1-dev I go to the Synaptic Package Manager. libgdal1-dev I search for libgdal1-dev in Synaptic Package Manager and I get an entry. I Mark for Installation and then Synaptic Package Manager suggests removing libxerces-c2-dev which is actually added via the initial command. Should I trust Synaptic Package Manager with this suggestion, and proceed further? proj What should I do about proj. There are some packages in Synaptic Package Manager such as proj-bin or libproj-dev. Should I install them? I think proj has to do with this and this What should I do to make sure that this simulation tool works fine?
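
    One hedged reading of the two errors is that 'proj' no longer exists as an installable package on newer releases (it was split up) and that libgdal1-dev is only a thin wrapper around libgdal-dev. Assuming that is the case here, installing the split packages directly may satisfy the build; exact package names vary by Ubuntu release, so treat these as candidates rather than the tutorial's official list:

        # 'proj' was split into separate runtime and development packages
        sudo apt-get install proj-bin libproj-dev
        # Newer releases ship libgdal-dev, which libgdal1-dev merely depends on
        sudo apt-get install libgdal-dev

    Whether accepting Synaptic's suggestion to remove libxerces-c2-dev is safe depends on which Xerces version the simulation build actually needs, so it is worth checking the linked tutorial before letting it proceed.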

    Read the article

  • Git-based storage and publishing, infrastructure advice

    - by Joel Martinez
    I wanted to get some advice on moving a system to "the cloud" ... specifically, I'm looking to move into some of Windows Azure's managed services, as right now I'm managing a VM. Basically, the system operates on some data stored in a github git repository. I'll describe the current architecture: Current system (all hosted on a single server): GitHub - configured with a webhook pointing at ... ASP.NET MVC application - to accept the webhook from git. It pushes a message onto ... Azure service bus Queue - which is drained by ... Windows Service - pulls the message from the queue and ... Fetches the latest data from the git repository (using GitLib2Sharp) onto the local disk and finally ... Operates on the data in git to produce a static HTML website hosted/served by IIS. The system works really well, actually ... but I would like to get out of the business of managing the VM, and move to using some combination of Azure web and worker roles. But because the system relies so heavily on the git repository on the local filesystem, I'm finding it difficult to figure out how to architect in the cloud. I know you can get file system access, so in theory I could just fetch the repository if there's nothing on disk ... but the performance/responsiveness of the system sort of depends on the repository being available and only having to fetch diffs, which is relatively quick. As opposed to periodically having to fetch the entire (somewhat large) git repository if the web or worker role was recycled, or something. So I would love some advice on how you would architect such a system :) Ultimately, the only real requirement is to be able to serve HTML content that's been produced from the contents of a git repository (in a relatively responsive manner, from a publishing perspective) ... please feel free to ask any clarifying questions if there's something I omitted. Thanks!

    Read the article

  • Disable YouTube Comments while using Chrome

    - by Asian Angel
    Are you tired of the comments at YouTube being full of profanity or irritating to look at? Now you can watch videos minus the comments with the No YouTube Comments extension for Google Chrome. Before Sometimes reading through the comments at YouTube can be interesting or even fun, but at other times they can be a bother to you. It all depends on the contents of the comments (i.e. lots of profanity) and what you would like to have visible while watching videos. After Once you have installed the extension, all that you will need to do is refresh the page (if you were already watching a video). As you can see, any and all signs of the comments section are gone. All nice and clean now… Conclusion If you love to check out your favorite videos on YouTube but are annoyed by the multitude of meaningless and profane comments, this extension for Chrome gets the job done. Links Download the No YouTube Comments extension (Google Chrome Extensions)

    Read the article

  • configure: error: Could not find libavformat - part of ffmpeg after installing libav from source

    - by Patryk
    I tried to install minidlna on my machine but it appeared to me that it's not in repositories anymore. Well then I decided to compile it myself. After downloading version 1.1.3 I tried to compile but I needed libav headers which I couldn't install via apt - no idea why there has been a lot of broken packages: $ sudo apt-get install libavcodec-dev Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: libavcodec-dev : Depends: libavutil-dev (= 6:9.13-0ubuntu0.14.04.1) but it is not going to be installed E: Unable to correct problems, you have held broken packages. Anyways I installed libav 9.13 from source and now I came to this point: $ ./configure ... checking for av_open_input_file in -lavformat... no checking for avformat_open_input in -lavformat... no checking for av_open_input_file in -lavformat... no checking for avformat_open_input in -lavformat... no configure: error: Could not find libavformat - part of ffmpeg but I have installed that! Even in the install log I can see : ... INSTALL libavdevice/libavdevice.a INSTALL libavfilter/libavfilter.a INSTALL libavformat/libavformat.a INSTALL libavresample/libavresample.a INSTALL libavcodec/libavcodec.a INSTALL libswscale/libswscale.a INSTALL libavutil/libavutil.a INSTALL libavdevice/avdevice.h INSTALL libavdevice/version.h INSTALL libavdevice/libavdevice.pc INSTALL libavfilter/avfilter.h INSTALL libavfilter/avfiltergraph.h INSTALL libavfilter/buffersink.h INSTALL libavfilter/buffersrc.h INSTALL libavfilter/version.h INSTALL libavfilter/libavfilter.pc INSTALL libavformat/avformat.h ....
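
    Since libav was compiled from source, its libraries most likely landed under /usr/local, where minidlna's configure script may not be looking. A hedged sketch, assuming the default /usr/local install prefix was used, is to point pkg-config and the runtime linker at that prefix and then re-run configure:

        # Tell pkg-config where the freshly installed libav .pc files live
        export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH
        # Refresh the linker cache so /usr/local/lib is searched
        sudo ldconfig
        # Re-run the test for avformat_open_input
        ./configure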

    Read the article

  • After force-installing a 32-bit deb failed, how can I install the 64-bit version?

    - by TryTryAgain
    I tried to dpkg -i --force-architecture google-earth-stable_i386.deb and it failed. But now when I try to install the amd64.deb it fails saying dpkg: error processing google-earth-stable_current_amd64.deb (--install): google-earth-stable: 6.2.2.6613-r0 (Multi-Arch: no) is not co-installable with google-earth-stable:i386 6.2.2.6613-r0 (Multi-Arch: no) which is currently installed Errors were encountered while processing: google-earth-stable_current_amd64.deb somehow it thinks the i386 version is installed. No google-earth files or directories even exist. sudo dpkg --configure -a outputs: dpkg: dependency problems prevent configuration of google-earth-stable:i386: google-earth-stable:i386 depends on lsb-core (= 3.2). dpkg: error processing google-earth-stable:i386 (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: google-earth-stable:i386 so it does exist in some capacity. sudo apt-get -f install does nothing out of the ordinary: Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. The weird thing is that synaptic doesn't show any google earth package available let alone installed, nothing under the broken filter either. I have also tried sudo apt-get autoremove and sudo apt-get autoclean So, my question: How can I get rid of this issue?
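
    Because the forced 32-bit install left a broken google-earth-stable:i386 entry behind in dpkg's database, one hedged approach is to remove that half-installed record first and only then install the amd64 package. This is a sketch of the generic dpkg cleanup, not a guaranteed fix:

        # Drop the half-installed 32-bit package from dpkg's database
        sudo dpkg --remove --force-remove-reinstreq google-earth-stable:i386
        # Clean up anything apt still considers broken
        sudo apt-get -f install
        # Now try the 64-bit package again
        sudo dpkg -i google-earth-stable_current_amd64.deb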

    Read the article

  • Advantages of Singleton Class over Static Class?

    Point 1) Singleton: We can get the singleton object and then pass it to other methods. Static Class: We cannot pass a static class to other methods the way we pass objects. Point 2) Singleton: In the future, it is easy to change the object-creation logic to some pooling mechanism. Static Class: It is very difficult to implement pooling logic with a static class. We would need to make the class non-static and then make all of its methods non-static, so the entire calling code would need to change. Point 3) Singleton: Can a singleton class be inherited by a subclass? The singleton pattern places no restriction on inheritance, so we should be able to do this as long as the subclass preserves the singleton behaviour. There is nothing fundamentally wrong with subclassing a class that is intended to be a singleton; there are many reasons you might want to do it, and there are many ways to accomplish it. It depends on the language you use. Static Class: We cannot inherit a static class from another static class in C#. Think about it this way: you access static members via the type name, like this: MyStaticType.MyStaticMember(); Were you to inherit from that class, you would have to access it via the new type name: MyNewType.MyStaticMember(); Thus, the new item bears no relationship to the original when used in code. There would be no way to take advantage of any inheritance relationship for things like polymorphism.

    Read the article

  • Google earth will not reinstall

    - by chad
    I was trying to perform the fix found at this link http://www.omgubuntu.co.uk/2012/01/how-to-make-google-earth-look-native-in-ubuntu It requires you to delete certain files from the /opt/google/earth/free folder and then add some new ones that you download. I deleted the files, but the links to download the new ones were unusable. I was using gksudo nautilus, so trash was disabled, meaning I could not restore the files I had deleted. I then tried to go to the Google Earth website and reinstall it. I downloaded the .deb, but when I tried to install it, it gave me an error message saying "cannot install ia32-libs". I tried installing this via terminal and it gave me an error message saying chad@chad-Lenovo-G570:~$ sudo apt-get install ia32-libs [sudo] password for chad: Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: ia32-libs : Depends: ia32-libs-multiarch E: Unable to correct problems, you have held broken packages. Now I am stuck without a functioning Google Earth. How can I fix this?
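
    The unmet ia32-libs-multiarch dependency usually means the i386 architecture is not enabled for multiarch, or the package lists are stale. A hedged sketch for a 64-bit system follows; note that dpkg --add-architecture only exists on newer dpkg versions, and on older releases the i386 architecture is enabled through a dpkg configuration file instead:

        # Make 32-bit packages visible to apt (newer dpkg versions)
        sudo dpkg --add-architecture i386
        sudo apt-get update
        # Pull in the dependency explicitly, then the metapackage itself
        sudo apt-get install ia32-libs-multiarch ia32-libs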

    Read the article

  • Error when upgrading initscripts using dist-upgrade on a cd image using uck

    - by InkBlend
    I was using UCK to customize an Ubuntu 11.10 image, and ran a dist-upgrade on it (from the console) to try to update all of the packages on it. The upgrade worked successfully for all but two packages, so I tried again and got the same error message. This is what happened: # sudo apt-get dist-upgrade Reading package lists... Done Building dependency tree Reading state information... Done Calculating upgrade... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 2 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? y Setting up initscripts (2.88dsf-13.10ubuntu4.1) ... guest environment detected: Linking /run/shm to /dev/shm rmdir: failed to remove `/run/shm': Device or resource busy Can't symlink /run/shm to /dev/shm; please fix manually. dpkg: error processing initscripts (--configure): subprocess installed post-installation script returned error exit status 1 No apport report written because MaxReports is reached already dpkg: dependency problems prevent configuration of ifupdown: ifupdown depends on initscripts (>= 2.88dsf-13.3); however: Package initscripts is not configured yet. dpkg: error processing ifupdown (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already Errors were encountered while processing: initscripts ifupdown E: Sub-process /usr/bin/dpkg returned an error code (1) # How can I fix this? Is the problem with the ISO, dpkg, or can I just not upgrade some packages with UCK? I am using Ubuntu 11.10 to and UCK 2.4.5 to customize an Ubuntu 11.10 image.
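
    The postinst message itself points at the manual fix: inside the UCK chroot, /run/shm is a busy mount point rather than an empty directory, so the script cannot replace it with a symlink. A hedged sketch, run from the customization console, would be to unmount it, create the symlink by hand and then let dpkg finish:

        # Unmount /run/shm if something is mounted there, then replace it by hand
        sudo umount /run/shm || true
        sudo rmdir /run/shm
        sudo ln -s /dev/shm /run/shm
        # Finish configuring initscripts and ifupdown
        sudo dpkg --configure -a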

    Read the article

  • CodePlex Daily Summary for Tuesday, May 22, 2012

    CodePlex Daily Summary for Tuesday, May 22, 2012Popular ReleasesBlackJumboDog: Ver5.6.3: 2012.05.22 Ver5.6.3  (1) HTTP????????、ftp://??????????????????????Internals Viewer (updated) for SQL Server 2008 R2.: Internals Viewer for SSMS 2008 R2: Updated code to work with SSMS 2008 R2. Changed dependancies, removing old assemblies no longer present and replacing them with updated versions.Orchard Project: Orchard 1.4.2: This is a service release to address 1.4 and 1.4.1 bugs. Please read our release notes for Orchard 1.4.2: http://docs.orchardproject.net/Documentation/Orchard-1-4-Release-NotesVirtu: Virtu 0.9.2: Source Requirements.NET Framework 4 Visual Studio 2010 with SP1 or Visual Studio 2010 Express with SP1 Silverlight 5 Tools for Visual Studio 2010 with SP1 Windows Phone 7 Developer Tools (which includes XNA Game Studio 4) Binaries RequirementsSilverlight 5 .NET Framework 4 XNA Framework 4SharePoint Euro 2012 - UEFA European Football Predictor: havivi.euro2012.wsp (1.0): New fetures:View other users predictions Hide/Show background image (web part property) Installing SharePoint Euro 2012 PredictorSharePoint Euro 2012 Predictor has been developed as a SharePoint Sandbox solution to support SharePoint Online (Office 365) Download the solution havivi.euro2012.wsp from the download page: Downloads Upload this solution to your Site Collection via the solutions area. Click on Activate to make the web parts in the solution available for use in the Site C...State Machine .netmf: State Machine Example: First release.... Contains 3 state machines running on separate threads. Event driven button to change the states. StateMachineEngine to support the Machines Message class with the type of data to send between statesSilverlight socket component: Smark.NetDisk: Smark.NetDisk?????Silverlight ?.net???????????,???????????????????????。Smark.NetDisk??????????,????.net???????????????????????tcp??;???????Silverlight??????????????????????callisto: callisto 2.0.28: Update log: - Extended Scribble protocol. - Updated HTML5 client code - now supports the latest versions of Google Chrome.ExtAspNet: ExtAspNet v3.1.6: ExtAspNet - ?? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ?????????? ExtAspNet ????? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ??????????。 ExtAspNet ??????? JavaScript,?? CSS,?? UpdatePanel,?? ViewState,?? WebServices ???????。 ??????: IE 7.0, Firefox 3.6, Chrome 3.0, Opera 10.5, Safari 3.0+ ????:Apache License 2.0 (Apache) ??:http://bbs.extasp.net/ ??:http://demo.extasp.net/ ??:http://doc.extasp.net/ ??:http://extaspnet.codeplex.com/ ??:http://sanshi.cnblogs.com/ ????: +2012-05-20 v3.1.6 -??RowD...Dynamics XRM Tools: Dynamics XRM Tools BETA 1.0: The Dynamics XRM Tools 1.0 BETA is now available Seperate downloads are available for On Premise and Online as certain features are only available On Premise. This is a BETA build and may not resemble the final release. Many enhancements are in development and will be made available soon. 
Please provide feedback so that we may learn and discover how to make these tools better.WatchersNET CKEditor™ Provider for DotNetNuke®: CKEditor Provider 1.14.05: Whats New Added New Editor Skin "BootstrapCK-Skin" Added New Editor Skin "Slick" Added Dnn Pages Drop Down to the Link Dialog (to quickly link to a portal tab) changes Fixed Issue #6956 Localization issue with some languages Fixed Issue #6930 Folder Tree view was not working in some cases Changed the user folder from User name to User id User Folder is now used when using Upload Function and User Folder is enabled File-Browser Fixed Resizer Preview Image Optimized the oEmbed Pl...PHPExcel: PHPExcel 1.7.7: See Change Log for details of the new features and bugfixes included in this release. BREAKING CHANGE! From PHPExcel 1.7.8 onwards, the 3rd-party tcPDF library will no longer be bundled with PHPExcel for rendering PDF files through the PDF Writer. The PDF Writer is being rewritten to allow a choice of 3rd party PDF libraries (tcPDF, mPDF, and domPDF initially), none of which will be bundled with PHPExcel, but which can be downloaded seperately from the appropriate sites.GhostBuster: GhostBuster Setup (91520): Added WMI based RestorePoint support Removed test code from program.cs Improved counting. Changed color of ghosted but unfiltered devices. Changed HwEntries into an ObservableCollection. Added Properties Form. Added Properties MenuItem to Context Menu. Added Hide Unfiltered Devices to Context Menu. If you like this tool, leave me a note, rate this project or write a review or Donate to Ghostbuster. Donate to GhostbusterEXCEL??、??、????????:DataPie(??MSSQL 2008、ORACLE、ACCESS 2007): DataPie_V3.2: V3.2, 2012?5?19? ????ORACLE??????。AvalonDock: AvalonDock 2.0.0795: Welcome to the Beta release of AvalonDock 2.0 After 4 months of hard work I'm ready to upload the beta version of AvalonDock 2.0. This new version boosts a lot of new features and now is stable enough to be deployed in production scenarios. For this reason I encourage everyone is using AD 1.3 or earlier to upgrade soon to this new version. The final version is scheduled for the end of June. What is included in Beta: 1) Stability! thanks to all users contribution I’ve corrected a lot of issues...myCollections: Version 2.1.0.0: New in this version : Improved UI New Metro Skin Improved Performance Added Proxy Settings New Music and Books Artist detail Lot of Bug FixingAspxCommerce: AspxCommerce1.1: AspxCommerce - 'Flexible and easy eCommerce platform' offers a complete e-Commerce solution that allows you to build and run your fully functional online store in minutes. You can create your storefront; manage the products through categories and subcategories, accept payments through credit cards and ship the ordered products to the customers. We have everything set up for you, so that you can only focus on building your own online store. Note: To login as a superuser, the username and pass...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.1616.403): BUG FIX Hide save button when Titles or Descriptions element is selectedMapWindow 6 Desktop GIS: MapWindow 6.1.2: Looking for a .Net GIS Map Application?MapWindow 6 Desktop GIS is an open source desktop GIS for Microsoft Windows that is built upon the DotSpatial Library. This release requires .Net 4 (Client Profile). Are you a software developer?Instead of downloading MapWindow for development purposes, get started with with the DotSpatial template. 
The extensions you create from the template can be loaded in MapWindow.DotSpatial: DotSpatial 1.2: This is a Minor Release. See the changes in the issue tracker. Minimal -- includes DotSpatial core and essential extensions Extended -- includes debugging symbols and additional extensions Tutorials are available. Just want to run the software? End user (non-programmer) version available branded as MapWindow Want to add your own feature? Develop a plugin, using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). Components ...New ProjectsBunch of Small Tools: Il s'agit du code source de petits projets principalement en rapport avec le japonais ou le chinois, destinés à des apprenants de ces langues. D'autres petits programmes de ma création peuvent y être ajoutés à ma discrétion. Ces projets ne sont plus en développement, et le code source présenté ici est mis à disposition dans le cadre de mon portefolio.Cat: summaryClínica DECORação: Projeto desenvolvido em ASP.NETDnD Campaing Manager: A little project to create a full campaing manager for D&D 3.5 DMsExcel add-in for Ranges: Slice, dice, and splice Excel ranges to your hearts content. Use RANGE.KEY to do "named argument" style programing.FirstPong: bla bla blaHP TRIM Stream Record Attachment Web App: A simple ASP.Net 4.0 Web application that streams the electronic attachment of a record from Hewlett Packard's TRIM record management software (HP TRIM), to the browser. It uses Routing to retrieve an attachment based on the Record Number, ie: http://localhost/View/D12/45 Where D12/45 is a HP TRIM record number. Code was originally from a HP TRIM sample for ASP.Net 2.0, but I expanded upon it and converted it to .Net 4.0 and added routing for a nicer URL. Json Services: Json Services is a web services framework that intended to make the creation and consumption of SOA based applications easier and more efficient. Json Services framework is built using the .NET framework 4.0 and depends heavily on reflection features of .NET, it is depends on the great Json.NET library for serialization. Json Services framework is intended to be simple and efficient and can be consumed using a wide range of clients, Java Script clients will gain the benefit of automatic...LAVAA: LAVAAlion: abcloja chocolate: Utilização de C# para criar uma loja virtual.Mega Terrain ++: This project presents a method to render large terrains with multiple materials using the Ogre graphics engine.Minería de datos - Reglas de asociación: Laboratorio N° 2 del curso Minería de datos, semestre 1 año 2012. Universidad de Santiago de ChileMNT Cryptography: A very simple cryptography classMsAccess to Sqlite converter: This project aims to create a small application to convert a MSAccess Database file into SQLite format. It needs the SQLite ADO library in order to workMSI Previewer: MSI Previewer is a tool, which extracts the given msi and displays the directory structure in which the files will be placed after the installation. It helps to preview the directory structure of the files present in the msi, which will be placed exactly after installation. muvonni: css3 stylesheet developmentMyModeler: MyModeler is a modeling tool for IT professional and ArchitectsNewSite: First asp.net test project i have on codeplex. I'm not entirely sure where this project is headed at the moment and only have some initial ideas. More details to follow soon.PHLTest: Test team foundationPhoto Studio Nana: ?????????? ???????? ? 
???playm_20120517_00365: just a colaboration plan...PROJETO CÓDIGO ABDI: Códigos da ABDIsample project for data entry: hi this is summary for first project....Sharepoint WP List Sinc Mapper: loren ipsum dolorSocialAuth4Net: SocialAuth4Net is open authentication wrapper for populer social platforms for example Facebook, LinkedIn and soon Twitter, Google+tiger: abctweetc: tweetc is a windows command line twitter client written in F#

    Read the article

  • Moving from windows to linux

    - by rincewind
    I need to reconcile these 2 facts: I don't feel comfortable working on Linux; I need to develop software for Linux. Some background: I have a 10+ years of programming experience on Windows (almost exclusively C/C++, but some .NET as well), I was a user of FreeBSD at home for about 3 years or so (then had to go back to Windows), and I've never had much luck with Linux. And now I have to develop software for Linux. I need a plan. On Windows, you can get away with just knowing a programming language, an API you're coding against, your IDE (VisualStudio) and some very basic tools for troubleshooting (Depends, ProcessExplorer, DebugView, WinDbg). Everything else comes naturally. On Linux, it's a very different story. How the hell would I know what DLL (sorry, Shared Object) would load, if I link to it from Firefox plugin? What's the Linux equivalent of inserting __asm int 3/DebugBreak() in the source and running the program, and then letting the OS call a debugger? Why the hell release builds use something, called appLoader, while debug builds work somehow different? Worst of all: how to provision Linux development environment? So, taking into account that hatred is usually associated with not knowing enough, what would you recommend? I'm ok with Emacs and GCC. I need to educate myself as a Linux admin/user, and I need to learn proper troubleshooting tools (strace is cool, btw), equivalents to the ones I mentioned above. Do I need to do Linux From Scratch? Or do I need to just read some books (I've read "UNIX programming enviornment" by Kernighan and "Advanced Programming..." by Stevens, but I need to learn something more practical)? Or do I need to have some Linux distro on my home computer?
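
    For the first of those questions (which shared object actually gets loaded), the usual shell-level tools are ldd and the dynamic linker's own debug output; the paths below are placeholders, not anything from the original post:

        # List the shared objects a binary or plugin would resolve at load time
        ldd /path/to/your-plugin.so
        # Watch the dynamic linker search for and load libraries at run time
        LD_DEBUG=libs firefox 2> linker-search.log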

    Read the article

  • Code Reuse is (Damn) Hard

    - by James Michael Hare
    Being a development team lead, the task of interviewing new candidates was part of my job.  Like any typical interview, we started with some easy questions to get them warmed up and help calm their nerves before hitting the hard stuff. One of those easier questions was almost always: “Name some benefits of object-oriented development.”  Nearly every time, the candidate would chime in with a plethora of canned answers which typically included: “it helps ease code reuse.”  Of course, this is a gross oversimplification.  Tools only ease reuse, its developers that ultimately can cause code to be reusable or not, regardless of the language or methodology. But it did get me thinking…  we always used to say that as part of our mantra as to why Object-Oriented Programming was so great.  With polymorphism, inheritance, encapsulation, etc. we in essence set up the concepts to help facilitate reuse as much as possible.  And yes, as a developer now of many years, I unquestionably held that belief for ages before it really struck me how my views on reuse have jaded over the years.  In fact, in many ways Agile rightly eschews reuse as taking a backseat to developing what's needed for the here and now.  It used to be I was in complete opposition to that view, but more and more I've come to see the logic in it.  Too many times I've seen developers (myself included) get lost in design paralysis trying to come up with the perfect abstraction that would stand all time.  Nearly without fail, all of these pieces of code become obsolete in a matter of months or years. It’s not that I don’t like reuse – it’s just that reuse is hard.  In fact, reuse is DAMN hard.  Many times it is just a distraction that eats up architect and developer time, and worse yet can be counter-productive and force wrong decisions.  Now don’t get me wrong, I love the idea of reusable code when it makes sense.  These are in the few cases where you are designing something that is inherently reusable.  The problem is, most business-class code is inherently unfit for reuse! Furthermore, the code that is reusable will often fail to be reused if you don’t have the proper framework in place for effective reuse that includes standardized versioning, building, releasing, and documenting the components.  That should always be standard across the board when promoting reusable code.  All of this is hard, and it should only be done when you have code that is truly reusable or you will be exerting a large amount of development effort for very little bang for your buck. But my goal here is not to get into how to reuse (that is a topic unto itself) but what should be reused.  First, let’s look at an extension method.  There’s many times where I want to kick off a thread to handle a task, then when I want to reign that thread in of course I want to do a Join on it.  But what if I only want to wait a limited amount of time and then Abort?  Well, I could of course write that logic out by hand each time, but it seemed like a great extension method: 1: public static class ThreadExtensions 2: { 3: public static bool JoinOrAbort(this Thread thread, TimeSpan timeToWait) 4: { 5: bool isJoined = false; 6:  7: if (thread != null) 8: { 9: isJoined = thread.Join(timeToWait); 10:  11: if (!isJoined) 12: { 13: thread.Abort(); 14: } 15: } 16: return isJoined; 17: } 18: } 19:  When I look at this code, I can immediately see things that jump out at me as reasons why this code is very reusable.  
Some of them are standard OO principles, and some are kind-of home grown litmus tests: Single Responsibility Principle (SRP) – The only reason this extension method need change is if the Thread class itself changes (one responsibility). Stable Dependencies Principle (SDP) – This method only depends on classes that are more stable than it is (System.Threading.Thread), and in itself is very stable, hence other classes may safely depend on it. It is also not dependent on any business domain, and thus isn't subject to changes as the business itself changes. Open-Closed Principle (OCP) – This class is inherently closed to change. Small and Stable Problem Domain – This method only cares about System.Threading.Thread. All-or-None Usage – A user of a reusable class should want the functionality of that class, not parts of that functionality.  That’s not to say they most use every method, but they shouldn’t be using a method just to get half of its result. Cost of Reuse vs. Cost to Recreate – since this class is highly stable and minimally complex, we can offer it up for reuse very cheaply by promoting it as “ready-to-go” and already unit tested (important!) and available through a standard release cycle (very important!). Okay, all seems good there, now lets look at an entity and DAO.  I don’t know about you all, but there have been times I’ve been in organizations that get the grand idea that all DAOs and entities should be standardized and shared.  While this may work for small or static organizations, it’s near ludicrous for anything large or volatile. 1: namespace Shared.Entities 2: { 3: public class Account 4: { 5: public int Id { get; set; } 6:  7: public string Name { get; set; } 8:  9: public Address HomeAddress { get; set; } 10:  11: public int Age { get; set;} 12:  13: public DateTime LastUsed { get; set; } 14:  15: // etc, etc, etc... 16: } 17: } 18:  19: ... 20:  21: namespace Shared.DataAccess 22: { 23: public class AccountDao 24: { 25: public Account FindAccount(int id) 26: { 27: // dao logic to query and return account 28: } 29:  30: ... 31:  32: } 33: } Now to be fair, I’m not saying there doesn’t exist an organization where some entites may be extremely static and unchanging.  But at best such entities and DAOs will be problematic cases of reuse.  Let’s examine those same tests: Single Responsibility Principle (SRP) – The reasons to change for these classes will be strongly dependent on what the definition of the account is which can change over time and may have multiple influences depending on the number of systems an account can cover. Stable Dependencies Principle (SDP) – This method depends on the data model beneath itself which also is largely dependent on the business definition of an account which can be very inherently unstable. Open-Closed Principle (OCP) – This class is not really closed for modification.  Every time the account definition may change, you’d need to modify this class. Small and Stable Problem Domain – The definition of an account is inherently unstable and in fact may be very large.  What if you are designing a system that aggregates account information from several sources? All-or-None Usage – What if your view of the account encompasses data from 3 different sources but you only care about one of those sources or one piece of data?  Should you have to take the hit of looking up all the other data?  On the other hand, should you have ten different methods returning portions of data in chunks people tend to ask for?  Neither is really a great solution. 
Cost of Reuse vs. Cost to Recreate – DAOs are really trivial to rewrite, and unless your definition of an account is EXTREMELY stable, the cost to promote, support, and release a reusable account entity and DAO are usually far higher than the cost to recreate as needed. It’s no accident that my case for reuse was a utility class and my case for non-reuse was an entity/DAO.  In general, the smaller and more stable an abstraction is, the higher its level of reuse.  When I became the lead of the Shared Components Committee at my workplace, one of the original goals we looked at satisfying was to find (or create), version, release, and promote a shared library of common utility classes, frameworks, and data access objects.  Now, of course, many of you will point to nHibernate and Entity for the latter, but we were looking at larger, macro collections of data that span multiple data sources of varying types (databases, web services, etc). As we got deeper and deeper in the details of how to manage and release these items, it quickly became apparent that while the case for reuse was typically a slam dunk for utilities and frameworks, the data access objects just didn’t “smell” right.  We ended up having session after session of design meetings to try and find the right way to share these data access components. When someone asked me why it was taking so long to iron out the shared entities, my response was quite simple, “Reuse is hard...”  And that’s when I realized, that while reuse is an awesome goal and we should strive to make code maintainable, often times you end up creating far more work for yourself than necessary by trying to force code to be reusable that inherently isn’t. Think about classes the times you’ve worked in a company where in the design session people fight over the best way to implement a class to make it maximally reusable, extensible, and any other buzzwordable.  Then think about how quickly that design became obsolete.  Many times I set out to do a project and think, “yes, this is the best design, I can extend it easily!” only to find out the business requirements change COMPLETELY in such a way that the design is rendered invalid.  Code, in general, tends to rust and age over time.  As such, writing reusable code can often be difficult and many times ends up being a futile exercise and worse yet, sometimes makes the code harder to maintain because it obfuscates the design in the name of extensibility or reusability. So what do I think are reusable components? Generic Utility classes – these tend to be small classes that assist in a task and have no business context whatsoever. Implementation Abstraction Frameworks – home-grown frameworks that try to isolate changes to third party products you may be depending on (like writing a messaging abstraction layer for publishing/subscribing that is independent of whether you use JMS, MSMQ, etc). Simplification and Uniformity Frameworks – To some extent this is similar to an abstraction framework, but there may be one chosen provider but a development shop mandate to perform certain complex items in a certain way.  Or, perhaps to simplify and dumb-down a complex task for the average developer (such as implementing a particular development-shop’s method of encryption). And what are less reusable? Application and Business Layers – tend to fluctuate a lot as requirements change and new features are added, so tend to be an unstable dependency.  May be reused across applications but also very volatile. 
Entities and Data Access Layers – these tend to be tuned to the scope of the application, so reusing them can be hard unless the abstract is very stable. So what’s the big lesson?  Reuse is hard.  In fact it’s damn hard.  And much of the time I’m not convinced we should focus too hard on it. If you’re designing a utility or framework, then by all means design it for reuse.  But you most also really set down a good versioning, release, and documentation process to maximize your chances.  For anything else, design it to be maintainable and extendable, but don’t waste the effort on reusability for something that most likely will be obsolete in a year or two anyway.

    Read the article

  • Which Browser is the Best to Use When Running Your Laptop on Battery Power?

    - by Asian Angel
    Squeezing the maximum amount of usage time out of your laptop battery can be challenging at times…it all depends on the software you are using. One piece of software we are all likely to be using is a browser, to keep up with our online lives… If your laptop is older, then getting the most out of your laptop’s aging battery is definitely a must. The good folks over at the 7 Tutorials blog have done a comparison test to see which browser is the gentlest on your laptop’s battery, and the results may surprise you. You can view the results by visiting the link below… Had better (or worse) luck with one of the browsers tested? Then make sure to share the results with your fellow readers in the comments! Test Comparison: Which Browser Will Make Your Laptop’s Battery Last Longer? [7 Tutorials]

    Read the article

  • The Customer Experience Revolution is Now

    - by Christie Flanagan
    To conclude this week’s focus on customer experience, I’ll end by recapping how my week began in New York City at The Experience Revolution. We all know that customers increasingly call the shots, and that winning or losing depends on how well we manage to meet their expectations. Today’s customers have a multitude of choices and are quick to jump ship following a poor experience. As a result, delivering an experience that is relevant, interactive, engaging, and consistent across channels and fostering rewarding relationships are increasingly important to business success.  It is only through exceptional customer experiences that companies can expect to acquire new customers and maintain their loyalty.  Over 400 of us gathered at Gotham Hall on Monday night to hear Oracle President Mark Hurd introduce Oracle Customer Experience, a cross-stack suite of customer experience products that include Oracle RightNow CX Cloud Service, Oracle Endeca, Oracle ATG Web Commerce, Oracle WebCenter,Oracle Siebel CRM, Oracle Fusion CRM, Oracle Social Network, and Oracle Knowledge Management. I'd encourage you check out this video to hear Mark explain why having a good product isn't good enough in the wake of the customer experience revolution. The Experience Revolution event itself was designed to deliver the kind of rich experience that sticks with you, using an interactive gallery of customer experience to deliver an individualized experience to each attendee through a combination of touch screens and near field communication technology.  Over the coming weeks we’ll share some of these customer experience vignettes with you. In the interim, you can learn more about Oracle Customer Experience solutions here.

    Read the article

  • Input handling between game loops

    - by user48023
    This may be obvious and trivial for you but as I am a newbie in programming I come with a specific question. I have three loops in my game engine which are input-loop, update-loop and render-loop. Update-loop is set to 10 ticks per second with a fixed timestep, render-loop is capped at around 60 fps and the input-loop runs as fast as possible. I am using one of the Javascript frameworks which provide such things but it doesn't really matter. Let's say I am rendering a tile map and the view of which elements are rendered depends on camera-like movement variables which are modified during key pressing. This is only about camera/viewport and rendering, no game physics involved here. And now, how can I handle input events among these loops to keep consistent engine reaction? Am I supposed to read the current variable modified with input and do some needed calculations in a update-loop and share the result so it could be interpolated in a render-loop? Or read the input effect directly inside the render-loop and put needed calculations inside? I thought interpreting user input inside an update-loop with a low tick rate would be inaccurate and kind of unresponsive while rendering with interpolation in the final view. How it is done properly in games overall?

    Read the article

  • Experience the new Bootloader of CE7 VirtualPC BSP - Display Resolution Override

    - by Kate Moss' Open Space
    The CE 7 (aka. Windows Embedded Compact) provides many new features, a new VirtualPC is one of them and as a replacement of Device Emulator in CE 6.   The bootloader of VPC BSP utilize a new introduced framework in CE7, the BLDR (not the BIOSLOADER!) It provides many rich and advanced feature, I will introduce more detail in my future posts. Today, I am going to introduce a basic usage: setting the display resolution. One of the benefit os using the BLDR is it provides interactive user interface, no DOS enviroment required, so user can change the setting on the console. It is especially useful on VPC: if you are not using Win7, edit a file in VHD could take some effort! In the Boot menu, you can select [5] Display Settings. There are a couples of sub menu allow you to change resolution, bpp and etc. As it is very straight forward, I won't go through each option except to the Option [3] "Change Viewable Display Region". The resolution it provides depends on the BIOS (VPC is a PC compatible device), and the minimum resolution it provides is 640x480. But what if user need smaller resolution or any non-standard resolution for whatever reason, it comes the use of "Change Viewable Display Region". User can use it to create a reduced display region. e.g. 240x320 on 640x480 screen. Also you can alter the platform\virtualpc\src\boot\bldr\config.c to add a non-standard resolution (e.g. 480x272) to displayMode array. Another solution in case of you don't want to rebuilt and replace bootloader is to alter SaveVGAArgs in platform\common\src\x86\common\io\ioctl.c to overwrite cxDisplayScreen and cyDisplayScreen setting to whatever resolution you want.

    Read the article

  • How do you pronounce the '...' operator?

    - by Uri
    Now, in C++ '...' became a first-class operator. In speech, how do you pronounce it? So far I've heard: dot dot dot triple dot ellipsis related: Is it OK to replace ... with ellipsis in writing? e.g. "The ellipsis operator expands the pack" EDIT (clarification): We are all aware that '...' as a punctuation mark is indeed called an ellipsis. But in the context of C++ we don't pronounce the names of punctuation marks. For example, the '&' operator, depending on the context, is pronounced as 'and', 'bitwise and', 'address of', 'logical and' (when && is used), or 'reference'. It is rarely pronounced as 'ampersand'. In talks, I have a feeling that 'dot dot dot' is used more often. For example: http://channel9.msdn.com/Events/GoingNative/GoingNative-2012/Variadic-Templates-are-Funadic (an excellent presentation about variadic templates). On the other hand, 'dot dot dot' is awkwardly hard to pronounce ('d' and 't' are both pronounced with the tongue). Can we pronounce it 'unpack'?

    Read the article

  • Permission based Authorization vs. Role based Authorization - Best Practices - 11g

    - by Prakash Yamuna
    In previous blog posts here and here I have alluded to the support in OWSM for Permission based authorization and Role based authorization support. Recently I was having a conversation with an internal team in Oracle looking to use OWSM for their Web Services security needs and one of the topics was around - When to use permission based authorization vs. role based authorization? As in most scenarios the answer is it depends! There are trade-offs involved in using the two approaches and you need to understand the trade-offs and you need to understand which trade-offs are better for your scenario. Role based Authorization: Simple to use. Just create a new custom OWSM policy and specify the role in the policy (using EM Fusion Middleware Control). Inconsistent if you have multiple type of resources in an application (ex: EJBs, Web Apps, Web Services) - ex: the model for securing EJBs with roles or the model for securing Web App roles - is inconsistent. Since the model is inconsistent, tooling is also fairly inconsistent. Achieving this use-case using JDeveloper is slightly complex - since JDeveloper does not directly support creating OWSM custom policies. Permission based Authorization: More complex. You need to attach both an OWSM policy and create OPSS Permission authorization policies. (Note: OWSM leverages OPSS Permission based Authorization support). More appropriate if you have multiple type of resources in an application (ex: EJBs, Web Apps, Web Services) and want a consistent authorization model. Consistent Tooling for managing authorization across different resources (ex: EM Fusion Middleware Control). Better Lifecycle support in terms of T2P, etc. Achieving this use-case using JDeveloper is slightly complex - since JDeveloper does not directly support creating/editing OPSS Permission based authorization policies.

    Read the article

  • Should these concerns be separated into separate objects?

    - by Lewis Bassett
    I have objects which implement the interface BroadcastInterface, which represents a message that is to be broadcast to all users of a particular group. It has a setter and getter method for the Subject and Body properties, and an addRecipientRole() method, which takes a given role and finds the contact token (e.g., an email address) for each user in the role and stores it. It then has a getContactTokens() method. BroadcastInterface objects are passed to an object that implements BroadcasterInterface. These objects are responsible for broadcasting a passed BroadcastInterface object. For example, an EmailBroadcaster implementation of the BroadcasterInterface will take EmailBroadcast objects and use the mailer services to email them out. Now, depending on what BroadcasterInterface implementation is used to broadcast, a different implementation of BroadcastInterface is used by client code. The Single Responsibility Principle seems to suggest that I should have a separate BroadcastFactory object, for creating BroadcastInterface objects, depending on what BroadcasterInterface implementation is used, as creating the BroadcastInterface object is a different responsibility to broadcasting them. But the class used for creating BroadcastInterface objects depends on what implementation of BroadcasterInterface is used to broadcast them. I think, because the knowledge of what method is used to send the broadcasts should only be configured once, the BroadcasterInterface object should be responsible for providing new BroadcastInterface objects. Does the responsibility of “creating and broadcasting objects that implement the BroadcastInterface interface” violate the Single Responsibility Principle? (Because the contact token for sending the broadcast out to the users will differ depending on the way it is broadcasted, I need different broadcast classes—though client code will not be able to tell the difference.)

    Read the article

  • Job title inflation and fluffing

    - by Amir Rezaei
    When you work on the same project for a relatively long time you get more experienced. You may also master many new technologies. Besides the coding, you may also do work that would classify as other roles. There is, however, one part of your career that may not get updated: your job title. It seems that besides all the technology hype there is also job title hype. It all depends on which company you work for. Many companies give employees better job titles because they want to keep them. The employee doesn’t change their job because the current title is much better, even if they would get better working conditions and benefits if they changed their job. When you consider changing your job, you notice that your job title is kind of “outdated”. People with less skill have a much better title for their job than you. You may very well explain what you did on your project, but the fact is that many employers go by the title. So here are the questions: Do you change your current title in your CV? What are the other options? Here are some good readings regarding these phenomena: Job title inflation Job title fluffing

    Read the article

  • Ignore Partial Upgrade -- Google Earth Dependencies

    - by pyraz
    I'm running a 64-bit install of Xubuntu 12.04. It took me a little while to get Google Earth working. The 64-bit Google earth package requires some 32-bit gtk libraries provided by ia32-libs. However, when I ran a simulation to install ia32-libs and it's dependencies, it wanted to remove a ton of programs, including the xubuntu-desktop meta-package. As a work-around, I used getlibs to get the 32-bit libraries I needed, and then installed Google Earth with the deb package and the --ignore-depend option to dpkg. Awesome, Google Earth is installed and is working great! Now, however, Update Manager keeps complaining about a "Partial Upgrade", and apt-get won't let me install any new applications. It wants me to do a fix-broken install, but when I do a simulation of apt-get -f install I get some very bad news, they want to uninstall the Google Earth I just worked so hard to install! $> apt-get -f -s install Reading package lists... Done Building dependency tree Reading state information... Done Correcting dependencies... Done The following packages will be REMOVED: googleearth 0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded. Remv googleearth [6.0.3.2197+0.7.0-1] TL;DR The --ignore-depends passed to dpkg is not propagating to apt-get, so now I can't install any new applications until I uninstall Google Earth, because of it's missing dependencies (even though it works fine without them). How can I fix this?
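
    One workaround that is sometimes suggested for this situation (a hedged sketch only, and an admittedly blunt one) is to satisfy dpkg on paper by building an empty placeholder package with equivs that Provides: ia32-libs, so apt stops treating googleearth's dependency as broken. The package name ia32-libs-dummy below is made up for the example:

        sudo apt-get install equivs
        # Generate a control-file template, then edit it so it contains:
        #   Package: ia32-libs-dummy
        #   Provides: ia32-libs
        equivs-control ia32-libs-dummy.ctl
        editor ia32-libs-dummy.ctl
        # Build and install the empty placeholder package
        equivs-build ia32-libs-dummy.ctl
        sudo dpkg -i ia32-libs-dummy_*.deb
        sudo apt-get -f install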

    Read the article

  • Firefox dependencies problem on 12.10

    - by theshu
    I did a fresh install of Ubuntu 12.10 today and now am getting the following error: You might want to run 'apt-get -f install' to correct these. The following packages have unmet dependencies: firefox-globalmenu : Depends: firefox (= 17.0.1+build1-0ubuntu0.12.10.1) but 16.0.1+build1-0ubuntu1 is installed E: Unmet dependencies. Try using -f. When I run the sudo apt-get -f install...I get the following after I type the "y": (Reading database ... 179765 files and directories currently installed.) Preparing to replace firefox 16.0.1+build1-0ubuntu1 (using .../firefox_17.0.1+build1-0ubuntu0.12.10.1_i386.deb) ... Unpacking replacement firefox ... dpkg-deb (subprocess): decompressing archive member: internal gzip read error: '<fd:4>: incorrect data check' dpkg-deb: error: subprocess <decompress> returned error exit status 2 dpkg: error processing /var/cache/apt/archives/firefox_17.0.1+build1-0ubuntu0.12.10.1_i386.deb (--unpack): cannot copy extracted data for './etc/apparmor.d/usr.bin.firefox' to '/etc/apparmor.d/usr.bin.firefox.dpkg-new': unexpected end of file or stream Please restart all running instances of firefox, or you will experience problems. Processing triggers for bamfdaemon ... Rebuilding /usr/share/applications/bamf.index... Processing triggers for desktop-file-utils ... Processing triggers for gnome-menus ... Processing triggers for man-db ... Errors were encountered while processing: /var/cache/apt/archives/firefox_17.0.1+build1-0ubuntu0.12.10.1_i386.deb E: Sub-process /usr/bin/dpkg returned an error code (1) This error is preventing me from installing Synaptic package manager and is also preventing me from installing anything with the Software center or Update manager. I tried uninstalling Firefox and also upgrading Firefox to no avail as well. Open to suggestions.
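
    The "internal gzip read error" while unpacking usually points at a corrupted download in apt's cache rather than a real dependency problem, so a hedged first attempt is to discard the cached .deb files and fetch the upgrade again:

        # Throw away the (possibly corrupt) cached package files
        sudo apt-get clean
        # Re-download the package lists and retry the interrupted upgrade
        sudo apt-get update
        sudo apt-get -f install
        sudo apt-get install firefox firefox-globalmenu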

    Read the article

  • Unable to install mysql in ubuntu

    - by Anand
    I purged my existing mysql-server from ubuntu and re-installed the same. This was due there was an upgrade from my ubuntu and I was unable to start my sql-server. After cleaned up and taking backup of my data , I re-insalled mysql. During installaion I received following message popup. Unable to set password for the MySQL "root" user An error occurred while setting the password for the MySQL administrative user. This may have happened because the account already has a password, or because of a communication problem with the MySQL server. You should check the account's password after the package installation. Please read the /usr/share/doc/mysql-server-5.5/README.Debian file for more information. ¦ ¦ After the installation was failed, following error were received. 121109 20:36:18 InnoDB: Initializing buffer pool, size = 128.0M 121109 20:36:18 InnoDB: Completed initialization of buffer pool 121109 20:36:18 InnoDB: highest supported file format is Barracuda. 121109 20:36:18 InnoDB: Waiting for the background threads to start 121109 20:36:19 InnoDB: 1.1.8 started; log sequence number 2395841 ERROR: 130 Incorrect file format 'user' 121109 20:36:19 [ERROR] Aborting 121109 20:36:19 InnoDB: Starting shutdown... 121109 20:36:20 InnoDB: Shutdown completed; log sequence number 2395841 121109 20:36:20 [Note] /usr/sbin/mysqld: Shutdown complete start: Job failed to start invoke-rc.d: initscript mysql, action "start" failed. dpkg: error processing mysql-server-5.5 (--configure): subprocess installed post-installation script returned error exit status 1 dpkg: dependency problems prevent configuration of mysql-server: mysql-server depends on mysql-server-5.5; however: Package mysql-server-5.5 is not configured yet. dpkg: error processing mysql-server (--configure): dependency problems - leaving unconfigured No apport report written because the error message indicates its a followup error from a previous failure. Errors were encountered while processing: mysql-server-5.5 mysql-server E: Sub-process /usr/bin/dpkg returned an error code (1)
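
    The "ERROR: 130 Incorrect file format 'user'" line refers to the mysql.user system table under /var/lib/mysql, which apparently survived the purge in an incompatible format. Assuming the data really has been backed up as described above, a hedged sketch is to move the old data directory out of the way and reinstall cleanly:

        # Keep the old data directory, but move it out of the server's way
        sudo mv /var/lib/mysql /var/lib/mysql.old
        # Remove the half-configured packages completely, then reinstall
        sudo apt-get purge mysql-server mysql-server-5.5
        sudo apt-get install mysql-server-5.5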

    Read the article
