Search Results

Search found 4084 results on 164 pages for 'r tree'.


  • doubt regarding carrying data in custom events using actionscript

    - by user267530
    Hi, I am working on ActionScript to generate a SWF dynamically using JSON data coming from an HTTP request. I receive the data on creationComplete and try to generate a tree-like structure. I don't create the whole tree at the same time; I create two levels at a time, level 1 and level 2. My goal is to attach custom events to the panels which represent tree nodes. When a user clicks a panel, it dispatches a custom event and tries to generate the next level. So it goes like this: on creationComplete, get the JSON and create the top two levels; click on level 2, create levels 2 and 3; click on level 3, create levels 3 and 4; and so on. I am attaching my code with this email. Please take a look at it, and share any hints on how you would do this if you needed to paint a tree having total level number = "n" (where 0 < n):

    import com.iwobanas.effects.*;
    import flash.events.MouseEvent;
    import flash.filters.BitmapFilterQuality;
    import flash.filters.BitmapFilterType;
    import flash.filters.GradientGlowFilter;
    import mx.controls.Alert;

    private var roundedMask:Sprite;
    private var panel:NewPanel;
    public var oldPanelIds:Array = new Array();
    public var pages:Array = new Array(); // cleanup
    public var delPages:Array = new Array();

    public function DrawPlaybook(pos:Number, title:String, chld:Object):void {
        panel = new NewPanel(chld);
        panel.title = title;
        panel.name = title;
        panel.width = 100;
        panel.height = 80;
        panel.x = pos + 5;
        panel.y = 40;

        // Define a gradient glow.
        var gradientGlow:GradientGlowFilter = new GradientGlowFilter();
        gradientGlow.distance = 0;
        gradientGlow.angle = 45;
        gradientGlow.colors = [0xFFFFF0, 0xFFFFFF];
        gradientGlow.alphas = [0, 1];
        gradientGlow.ratios = [0, 255];
        gradientGlow.blurX = 10;
        gradientGlow.blurY = 10;
        gradientGlow.strength = 2;
        gradientGlow.quality = BitmapFilterQuality.HIGH;
        gradientGlow.type = BitmapFilterType.OUTER;
        panel.filters = [gradientGlow];

        this.rawChildren.addChild(panel);
        pages.push(panel);
        panel.addEventListener(MouseEvent.CLICK,
            function(e:MouseEvent){ onClickHandler(e, title, chld); });
        this.addEventListener(CustomPageClickEvent.PANEL_CLICKED,
            function(e:CustomPageClickEvent){ onCustomPanelClicked(e, title); });
    }

    public function onClickHandler(e:MouseEvent, title:String, chld:Object):void {
        for each (var stp1:NewPanel in pages) {
            if (stp1.title == title) {
                var eventObj:CustomPageClickEvent = new CustomPageClickEvent("panelClicked");
                eventObj.panelClicked = stp1;
                dispatchEvent(eventObj);
            }
        }
    }

    private function onCustomPanelClicked(e:CustomPageClickEvent, title:String):void {
        // cleanup itself
        Alert.show("onCustomPanelClicked" + title);
        var panel:NewPanel;
        for each (var stp:NewPanel in pages) {
            startAnimation(e, stp);
        }
        if (title == e.panelClicked.title) {
            panel = new NewPanel(null);
            panel.title = title;
            panel.name = title;
            panel.width = 150;
            panel.height = 80;
            panel.x = 100;
            panel.y = 40;
            this.rawChildren.addChild(panel);

            var slideRight:SlideRight = new SlideRight();
            slideRight.target = panel;
            slideRight.duration = 750;
            slideRight.showTarget = true;
            slideRight.play();

            // draw the steps
            var jsonData = this.map.getValue(title);
            var posX:Number = 50;
            var posY:Number = 175;
            for each (var pnl:NewPanel in pages) {
                pages.pop();
            }
            for each (var stp1:Object in jsonData.children) {
                panel = new NewPanel(null);
                panel.title = stp1.text;
                panel.name = stp1.id;
                panel.width = 100;
                panel.id = stp1.id;
                panel.height = 80;
                panel.x = posX;
                panel.y = posY;
                posX += 150;
                var s:String = "hi" + stp1.text;
                panel.addEventListener(MouseEvent.CLICK,
                    function(e:MouseEvent){ onChildClick(e, s); });
                this.addEventListener(CustomPageClickEvent.PANEL_CLICKED,
                    function(e:CustomPageClickEvent){ onCustomPnlClicked(e); });
                this.rawChildren.addChild(panel);
                pages.push(panel);
                var slide:SlideUp = new SlideUp();
                slide.target = panel;
                slide.duration = 1500;
                slide.showTarget = false;
                slide.play();
            }
        }
    }

    public function onChildClick(e:MouseEvent, s:String):void {
        for each (var stp1:NewPanel in pages) {
            if (stp1.title == e.currentTarget.title) {
                var eventObj:CustomPageClickEvent = new CustomPageClickEvent("panelClicked");
                eventObj.panelClicked = stp1;
                dispatchEvent(eventObj);
            }
        }
    }

    private function onCustomPnlClicked(e:CustomPageClickEvent):void {
        for each (var pnl:NewPanel in pages) {
            pages.pop();
        }
    }

    private function fadePanel(event:Event, panel:NewPanel):void {
        panel.alpha -= .005;
        if (panel.alpha <= 0) {
            panel.removeEventListener(Event.ENTER_FRAME,
                function(e:Event){ fadePanel(e, panel); });
        }
        panel.title = "";
    }

    private function startAnimation(event:CustomPageClickEvent, panel:NewPanel):void {
        panel.addEventListener(Event.ENTER_FRAME,
            function(e:Event){ fadePanel(e, panel); });
    }

    Thanks in advance. Palash


  • Will these optimizations to my Ruby implementation of diff improve performance in a Rails app?

    - by grg-n-sox
    <tl;dr> In source version control diff patch generation, would it be worth it to use the optimizations listed at the very bottom of this writing (see <optimizations>) in my Ruby implementation of diff for making diff patches? </tl;dr>

    <introduction> I am programming something I have never done before, and there might already be tools out there to do the exact thing I am programming, but at this point I am having too much fun to care, so I am still going to do it from scratch, even if there is a tool for this. So anyway, I am working on a Ruby on Rails app and need a certain feature. Basically I want each entry in a table of mine, let's say for example a table of video games, to have a stored chunk of text that represents a review or something of the sort for that table entry. However, I want this text to be both editable by any registered user and also to keep track of different submissions in a version control system. The simplest solution I could think of is to keep track of the text body and the diff patch history of different versions of the text body as objects in Ruby and then serialize it, preferably in human-readable form (so I'll most likely use YAML for this), for editing if needed due to corruption by a software bug or a mistake made by an admin doing some version editing. So at first I just dove head first into this feature, only to find that generating a diff patch efficiently is more difficult than I thought. So I did some research and came across some ideas. Some I have implemented already and some I have not. However, it all pretty much revolves around the longest common subsequence problem, as you would already know if you have done anything with diff or diff-like features, and optimizing the function that solves it. Currently I have it truncate the compared versions of the text body from the beginning and end until non-matching lines are found. Then it solves the problem using a comparison matrix, but instead of incrementing the value stored in a cell when it finds a matching line, like in most longest common subsequence algorithms I have seen examples of, I increment when I have a non-matching line, so as to calculate edit distance instead of longest common subsequence. As far as I can tell, the two approaches are essentially two sides of the same coin, so either could be used to derive an answer. It then back-traces through the comparison matrix and notes when there was an incrementation and in which adjacent cell (West, Northwest, or North) to determine that line's diff entry, and assumes all other lines to be unchanged. (A sketch of this truncate-then-matrix approach appears at the end of this post.) Normally I would leave it at that, but since this is going into a Rails environment and not just some stand-alone Ruby script, I started getting worried about optimizing at least enough so that a spammer who somehow knew how I implemented the version control system, and knew my worst-case-scenario entry, still wouldn't be able to hit the server that badly. After some searching and reading of research papers and articles on the internet, I've come across several optimizations that seem decent, but all seem to have pros and cons, and I am having a hard time deciding how well the pros and cons balance out in this situation. So are the ones listed here worth it? I have listed them with known pros and cons.
    </introduction>

    <optimizations>
    1. Chop the compared sequences into multiple chunks of subsequences by splitting where lines are unchanged, and then truncate each section of unchanged lines at the beginning and end of each section. Then solve the edit distance of each subsequence.
       Pro: Changes the time increase, as the changed area gets bigger, from a quadratic increase to something closer to linear.
       Con: Figuring out where to split already seems to require solving edit distance, except now you don't care how it changed. It would be fine if this were solvable by a process closer to solving Hamming distance, but a single insertion throws that off.
    2. Use a cryptographic hash function to both convert all sequence elements into integers and ensure uniqueness. Then solve the edit distance comparing the hash integers instead of the sequence elements themselves.
       Pro: Comparing two integers is faster than comparing two strings, so a slight performance gain is received on every comparison, which can add up to a lot overall.
       Con: Hashing every sequence element takes time, and may end up costing more time than you gain back from the integer comparisons. You could use the built-in hash function for a string, but that will not guarantee uniqueness.
    3. Use lazy evaluation to only calculate the three center-most diagonals of the comparison matrix, and only calculate additional diagonals as needed. Also use this approach to possibly remove the need, on some comparisons, to compare all three adjacent cells as described here.
       Pro: Can turn an algorithm that always takes O(n * m) time into one where that is only the worst case; the best case becomes practically linear, and the average case falls somewhere between the two.
       Con: It is an algorithm I've only seen implemented in functional programming languages, and I am having a difficult time working out how to convert it into Ruby based on how it is described at the site linked to above.
    4. Make a C module that does the hard work at the native level, and just make a Ruby wrapper for it so Ruby can make all the calls to it that it needs.
       Pro: I have to imagine that evaluating something like this in C could be a LOT faster.
       Con: I have no idea how Rails handles apps with Ruby code that has C extensions, and it hurts the portability of the app.
    5. This one is an optimization for after the solving of edit distance: store additional combined diffs with the ones produced by each version, to make a delta-tree data structure with the most recently made diff as the root node, so that getting to any version takes worst-case O(log n) time instead of O(n).
       Pro: Would make going back to an old version a lot faster.
       Con: On every new commit, the delta-tree would get a new root node, which costs time to reorganize the tree for an operation that will be carried out a lot more often than going back a version, not to mention the unlikelihood that it will be an old version.
    </optimizations>

    So are these things worth the effort?
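    For concreteness, here is a minimal sketch of the truncate-then-matrix approach described in the introduction (shown in C#; the idea is language-neutral, and this is not the poster's actual Ruby code). It trims the shared prefix and suffix, then fills a standard edit-distance matrix over what remains:

        using System;

        static class EditDistance
        {
            // Trim the shared prefix/suffix, then run the O(n*m) DP on the rest.
            public static int Between(string[] a, string[] b)
            {
                int start = 0;
                while (start < a.Length && start < b.Length && a[start] == b[start])
                    start++;

                int endA = a.Length, endB = b.Length;
                while (endA > start && endB > start && a[endA - 1] == b[endB - 1])
                {
                    endA--;
                    endB--;
                }

                int n = endA - start, m = endB - start;
                var d = new int[n + 1, m + 1];
                for (int i = 0; i <= n; i++) d[i, 0] = i;
                for (int j = 0; j <= m; j++) d[0, j] = j;

                for (int i = 1; i <= n; i++)
                    for (int j = 1; j <= m; j++)
                    {
                        int cost = a[start + i - 1] == b[start + j - 1] ? 0 : 1;
                        d[i, j] = Math.Min(Math.Min(d[i - 1, j] + 1,   // deletion (North)
                                                    d[i, j - 1] + 1),  // insertion (West)
                                           d[i - 1, j - 1] + cost);    // match/substitution (Northwest)
                    }
                // A backtrace over d (North/West/Northwest) would yield the actual diff entries.
                return d[n, m];
            }
        }

    The prefix/suffix truncation is what keeps the common case cheap: for a small edit in a large document, n and m shrink to roughly the size of the changed region before the quadratic part ever runs.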


  • Any tool to make git build every commit to a branch in a separate repository?

    - by Wayne
    A git tool that meets the specs below is needed. Does one already exist? If not, I will create a script and make it available on GitHub for others to use or contribute to. Or is there a completely different and better way to solve the need to build/test every commit to a branch in a git repository? Not just the latest, but each one back to a certain starting point. (A rough sketch of the core loop follows at the end of this post.)

    Background: Our development environment uses a separate continuous integration server, which is wonderful. However, it is still necessary to do full builds locally on each developer's PC to make sure a commit won't "break the build" when pushed to the CI server. Unfortunately, with the automated unit tests, those builds force the developer to wait 10 or 15 minutes every time. To solve this, we have set up a "mirror" git repository on each developer PC. We develop in the main repository, and anytime a local full build is needed, we run a couple of commands in the mirror repository to fetch, check out the commit we want to build, and build. It works extremely well, since we can continue working in the main repository while the build runs in parallel. There's only one main concern now: we want to make sure every single commit builds and tests fine, but we often get busy and neglect to build several fresh commits. Then, if the build fails, you have to bisect or manually build each interim commit to figure out which one broke.

    Requirements for this tool:
    - The tool will look at another repo, origin by default, fetch, and compare all commits that are in branches against two lists of commits: one list holds successfully built commits, the other holds commits that failed. It identifies any commits not yet in either list and builds them in a loop, in the order they were committed, stopping on the first one that fails. The tool adds each commit to either the success or failure list after it has attempted to build it.
    - The tool will ignore any "legacy" commits which are prior to the oldest commit in the success list. This logic makes the starting point in the next item possible.
    - Starting point: the tool can build a specific commit so that, if successful, it gets added to the success list. If it is the earliest commit in the success list, it becomes the "starting point", so that none of the commits prior to it are examined for builds.
    - Only linear tree support? Much like bisect, this tool works best on a commit tree which is, at least from its starting point, linear without any merges; that is, a tree built and updated entirely via rebase and fast-forward commits. If it fails on one commit in a branch, it will stop without building the rest that follow. Instead it will just move on to another branch, if any.
    - The tool must do these steps once by default, but allow a parameter to loop, with an option to set how many seconds between loops. Other tools like Hudson or CruiseControl could provide fancier scheduling options.
    - The tool must have good defaults but allow optional control. Which repo? origin by default. Which branches? All of them by default. What build tool? By default an executable file provided by the user, named "buildtest", "buildtest.sh", "buildtest.cmd", or "buildtest.exe", in the root folder of the repository. Loop delay? Run once by default, with an option to loop after a number of seconds between iterations.
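    To make the spec concrete, here is a rough C# sketch of the core reconciliation loop, driving the plain git CLI. The file names success.list and failed.list and the ./buildtest script are hypothetical placeholders; legacy-commit handling, multi-branch iteration, and the loop-delay option are omitted:

        using System;
        using System.Collections.Generic;
        using System.Diagnostics;
        using System.IO;

        class BuildEveryCommit
        {
            static string Git(string args)
            {
                var psi = new ProcessStartInfo("git", args)
                {
                    RedirectStandardOutput = true,
                    UseShellExecute = false
                };
                using (var p = Process.Start(psi))
                {
                    string output = p.StandardOutput.ReadToEnd();
                    p.WaitForExit();
                    return output;
                }
            }

            static HashSet<string> LoadList(string path)
            {
                return new HashSet<string>(
                    File.Exists(path) ? File.ReadAllLines(path) : new string[0]);
            }

            static void Main()
            {
                Git("fetch origin");

                var success = LoadList("success.list");
                var failed  = LoadList("failed.list");

                // Oldest-first commit list; --first-parent keeps the walk linear,
                // matching the "rebase/fast-forward only" caveat in the spec.
                var commits = Git("rev-list --reverse --first-parent origin/master")
                    .Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries);

                foreach (var sha in commits)
                {
                    if (success.Contains(sha) || failed.Contains(sha))
                        continue; // already attempted

                    Git("checkout " + sha);
                    var build = Process.Start("./buildtest"); // user-supplied build/test script
                    build.WaitForExit();

                    if (build.ExitCode == 0)
                    {
                        File.AppendAllText("success.list", sha + "\n");
                    }
                    else
                    {
                        File.AppendAllText("failed.list", sha + "\n");
                        break; // stop at the first failure, as the spec requires
                    }
                }
            }
        }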


  • ObjectContext.SaveChanges() fails with SQL CE

    - by David Veeneman
    I am creating a model-first Entity Framework 4 app that uses SQL CE as its data store. All is well until I call ObjectContext.SaveChanges() to save changes to the entities in the model. At that point, SaveChanges() throws a System.Data.UpdateException, with an inner exception message that reads as follows:

    Server-generated keys and server-generated values are not supported by SQL Server Compact.

    I am completely puzzled by this message. Any idea what is going on and how to fix it? Thanks. Here is the exception dump:

    System.Data.UpdateException was unhandled
      Message=An error occurred while updating the entries. See the inner exception for details.
      Source=System.Data.Entity
      StackTrace:
        at System.Data.Mapping.Update.Internal.UpdateTranslator.Update(IEntityStateManager stateManager, IEntityAdapter adapter)
        at System.Data.EntityClient.EntityAdapter.Update(IEntityStateManager entityCache)
        at System.Data.Objects.ObjectContext.SaveChanges(SaveOptions options)
        at System.Data.Objects.ObjectContext.SaveChanges()
        at FsDocumentationBuilder.ViewModel.Commands.SaveFileCommand.Execute(Object parameter) in D:\Users\dcveeneman\Documents\Visual Studio 2010\Projects\FsDocumentationBuilder\FsDocumentationBuilder\ViewModel\Commands\SaveFileCommand.cs:line 68
        at MS.Internal.Commands.CommandHelpers.CriticalExecuteCommandSource(ICommandSource commandSource, Boolean userInitiated)
        at System.Windows.Controls.Primitives.ButtonBase.OnClick()
        at System.Windows.Controls.Button.OnClick()
        at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e)
        at System.Windows.UIElement.OnMouseLeftButtonUpThunk(Object sender, MouseButtonEventArgs e)
        at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
        at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
        at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
        at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
        at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent)
        at System.Windows.UIElement.OnMouseUpThunk(Object sender, MouseButtonEventArgs e)
        at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
        at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
        at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
        at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
        at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
        at System.Windows.UIElement.RaiseTrustedEvent(RoutedEventArgs args)
        at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted)
        at System.Windows.Input.InputManager.ProcessStagingArea()
        at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input)
        at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport)
        at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel)
        at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, WindowMessage msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
        at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
        at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
        at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
        at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
        at MS.Internal.Threading.ExceptionFilterHelper.TryCatchWhen(Object source, Delegate method, Object args, Int32 numArgs, Delegate catchHandler)
        at System.Windows.Threading.Dispatcher.InvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)
        at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
        at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
        at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
        at System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)
        at System.Windows.Threading.Dispatcher.Run()
        at System.Windows.Application.RunDispatcher(Object ignore)
        at System.Windows.Application.RunInternal(Window window)
        at System.Windows.Application.Run(Window window)
        at System.Windows.Application.Run()
        at FsDocumentationBuilder.App.Main() in D:\Users\dcveeneman\Documents\Visual Studio 2010\Projects\FsDocumentationBuilder\FsDocumentationBuilder\obj\x86\Debug\App.g.cs:line 50
        at System.AppDomain._nExecuteAssembly(RuntimeAssembly assembly, String[] args)
        at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
        at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
        at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean ignoreSyncCtx)
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
        at System.Threading.ThreadHelper.ThreadStart()
      InnerException: System.Data.EntityCommandCompilationException
        Message=An error occurred while preparing the command definition. See the inner exception for details.
        Source=System.Data.Entity
        StackTrace:
          at System.Data.Mapping.Update.Internal.UpdateTranslator.CreateCommand(DbModificationCommandTree commandTree)
          at System.Data.Mapping.Update.Internal.DynamicUpdateCommand.CreateCommand(UpdateTranslator translator, Dictionary`2 identifierValues)
          at System.Data.Mapping.Update.Internal.DynamicUpdateCommand.Execute(UpdateTranslator translator, EntityConnection connection, Dictionary`2 identifierValues, List`1 generatedValues)
          at System.Data.Mapping.Update.Internal.UpdateTranslator.Update(IEntityStateManager stateManager, IEntityAdapter adapter)
        InnerException: System.NotSupportedException
          Message=Server-generated keys and server-generated values are not supported by SQL Server Compact.
          Source=System.Data.SqlServerCe.Entity
          StackTrace:
            at System.Data.SqlServerCe.SqlGen.DmlSqlGenerator.GenerateReturningSql(StringBuilder commandText, DbModificationCommandTree tree, ExpressionTranslator translator, DbExpression returning)
            at System.Data.SqlServerCe.SqlGen.DmlSqlGenerator.GenerateInsertSql(DbInsertCommandTree tree, List`1& parameters, Boolean isLocalProvider)
            at System.Data.SqlServerCe.SqlGen.SqlGenerator.GenerateSql(DbCommandTree tree, List`1& parameters, CommandType& commandType, Boolean isLocalProvider)
            at System.Data.SqlServerCe.SqlCeProviderServices.CreateCommand(DbProviderManifest providerManifest, DbCommandTree commandTree)
            at System.Data.SqlServerCe.SqlCeProviderServices.CreateDbCommandDefinition(DbProviderManifest providerManifest, DbCommandTree commandTree)
            at System.Data.Common.DbProviderServices.CreateCommandDefinition(DbCommandTree commandTree)
            at System.Data.Common.DbProviderServices.CreateCommand(DbCommandTree commandTree)
            at System.Data.Mapping.Update.Internal.UpdateTranslator.CreateCommand(DbModificationCommandTree commandTree)
          InnerException:


  • Problem upgrading kernel on Debian 3.1

    - by exhuma
    Hi, I have a quite old box in a remote server farm, so I have no direct access, only remote SSH (and, via SSH, a serial console). I haven't updated this box in ages. Now, whenever I want to install a new package, a dependency on glibc appears. Unfortunately, the install of glibc depends on a 2.6 kernel, and I am running a venerable 2.4 kernel (one more reason to upgrade). The problem is that the install of a new kernel has an indirect (via locales) dependency on glibc. So, to install glibc, I need a new kernel; for a new kernel, I need to upgrade glibc. Essentially I am blocked. What's the best way to proceed, considering I have no "hardware" access? Here's a quick transcript of the upgrade process:

    [green:~]% sudo aptitude install linux-image-686
    Reading Package Lists... Done
    Building Dependency Tree
    Reading extended state information
    Initializing package states... Done
    Reading task descriptions... Done
    The following packages are unused and will be REMOVED:
      gcc-4.3-base
    The following NEW packages will be automatically installed:
      dash libc6-i686 libparse-recdescent-perl linux-image-2.6-686 linux-image-2.6.18-6-686 module-init-tools yaird
    The following packages have been kept back:
      adduser apache2 apache2-mpm-prefork apache2-utils apache2.2-common apt apt-utils aptitude autoconf autotools-dev awstats base-files base-passwd [...snip...] util-linux vacation vim vim-common wamerican wbritish wget whiptail whois wwwconfig-common zlib1g
    The following NEW packages will be installed:
      dash libc6-i686 libparse-recdescent-perl linux-image-2.6-686 linux-image-2.6.18-6-686 linux-image-686 module-init-tools yaird
    The following packages will be upgraded:
      hotplug libc6
    2 packages upgraded, 8 newly installed, 1 to remove and 277 not upgraded.
    Need to get 0B/22.7MB of archives. After unpacking 52.1MB will be used.
    Do you want to continue? [Y/n/?]
    Writing extended state information... Done
    Preconfiguring packages ...
    (Reading database ... 34065 files and directories currently installed.)
    Preparing to replace libc6 2.3.6.ds1-13 (using .../libc6_2.7-18lenny2_i386.deb) ...
    Checking for services that may need to be restarted...
    Checking init scripts...
    WARNING: init script for postgresql not found.
    [ --- libc6 config screen appears here --- ]
    WARNING: POSIX threads library NPTL requires kernel version 2.6.8 or later. If you use a kernel 2.4, please upgrade it before installing glibc.
    The installation of a 2.6 kernel _could_ ask you to install a new libc first, this is NOT a bug, and should *NOT* be reported. In that case, please add etch sources to your /etc/apt/sources.list and run:
      apt-get install -t etch linux-image-2.6
    Then reboot into this new kernel, and proceed with your upgrade
    dpkg: error processing /var/cache/apt/archives/libc6_2.7-18lenny2_i386.deb (--unpack):
      subprocess pre-installation script returned error exit status 1
    Errors were encountered while processing:
      /var/cache/apt/archives/libc6_2.7-18lenny2_i386.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)
    Ack! Something bad happened while installing packages. Trying to recover:
    dpkg: dependency problems prevent configuration of locales:
      locales depends on glibc-2.7-1; however:
        Package glibc-2.7-1 is not installed.
    dpkg: error processing locales (--configure):
      dependency problems - leaving unconfigured
    Errors were encountered while processing:
      locales
    Reading Package Lists... Done
    Building Dependency Tree
    Reading extended state information
    Initializing package states... Done
    Reading task descriptions... Done

    Now, if I follow the instructions as prompted, I get the following. Note that I am using aptitude instead of apt-get to benefit from the better dependency tracking; I did try with apt-get first, but that led me to the same problem.

    [green:~]% sudo aptitude install -t etch linux-image-2.6.26-2-686
    Reading Package Lists... Done
    Building Dependency Tree
    Reading extended state information
    Initializing package states... Done
    Reading task descriptions... Done
    E: Unable to correct problems, you have held broken packages.
    E: Unable to correct dependencies, some packages cannot be installed
    E: Unable to resolve some dependencies!
    Some packages had unmet dependencies. This may mean that you have requested an impossible situation or, if you are using the unstable distribution, that some required packages have not yet been created or been moved out of Incoming.
    The following packages have unmet dependencies:
      linux-image-2.6.26-2-686: Depends: initramfs-tools (>= 0.55) but it is not installable, or yaird (>= 0.0.13) but it is not installable, or linux-initramfs-tool, which is a virtual package.

    Any ideas?


  • Why does my performance slow to a crawl when I move methods into a base class?

    - by Juliet
    I'm writing different implementations of immutable binary trees in C#, and I wanted my trees to inherit some common methods from a base class. Unfortunately, classes which derive from the base class are abysmally slow, while non-derived classes perform adequately. Here are two nearly identical implementations of an AVL tree to demonstrate:

    AvlTree: http://pastebin.com/V4WWUAyT
    DerivedAvlTree: http://pastebin.com/PussQDmN

    The two trees have the exact same code, but I've moved the DerivedAvlTree.Insert method into the base class. Here's a test app:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Linq;
    using Juliet.Collections.Immutable;

    namespace ConsoleApplication1
    {
        class Program
        {
            const int VALUE_COUNT = 5000;

            static void Main(string[] args)
            {
                var avlTreeTimes = TimeIt(TestAvlTree);
                var derivedAvlTreeTimes = TimeIt(TestDerivedAvlTree);
                Console.WriteLine("avlTreeTimes: {0}, derivedAvlTreeTimes: {1}", avlTreeTimes, derivedAvlTreeTimes);
            }

            static double TimeIt(Func<int, int> f)
            {
                var seeds = new int[] { 314159265, 271828183, 231406926, 141421356, 161803399, 266514414, 15485867, 122949829, 198491329, 42 };
                var times = new List<double>();
                foreach (int seed in seeds)
                {
                    var sw = Stopwatch.StartNew();
                    f(seed);
                    sw.Stop();
                    times.Add(sw.Elapsed.TotalMilliseconds);
                }
                // throwing away top and bottom results
                times.Sort();
                times.RemoveAt(0);
                times.RemoveAt(times.Count - 1);
                return times.Average();
            }

            static int TestAvlTree(int seed)
            {
                var rnd = new System.Random(seed);
                var avlTree = AvlTree<double>.Create((x, y) => x.CompareTo(y));
                for (int i = 0; i < VALUE_COUNT; i++)
                {
                    avlTree = avlTree.Insert(rnd.NextDouble());
                }
                return avlTree.Count;
            }

            static int TestDerivedAvlTree(int seed)
            {
                var rnd = new System.Random(seed);
                var avlTree2 = DerivedAvlTree<double>.Create((x, y) => x.CompareTo(y));
                for (int i = 0; i < VALUE_COUNT; i++)
                {
                    avlTree2 = avlTree2.Insert(rnd.NextDouble());
                }
                return avlTree2.Count;
            }
        }
    }

    AvlTree: inserts 5000 items in 121 ms.
    DerivedAvlTree: inserts 5000 items in 2182 ms.

    My profiler indicates that the program spends an inordinate amount of time in BaseBinaryTree.Insert. Anyone who's interested can see the EQATEC log file I've created with the code above (you'll need the EQATEC profiler to make sense of the file). I really want to use a common base class for all of my binary trees, but I can't do that if performance will suffer. What causes my DerivedAvlTree to perform so badly, and what can I do to fix it?
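    One variable worth isolating here is call dispatch: moving Insert into a base class turns direct (potentially inlinable) calls into virtual calls. Below is a hypothetical micro-benchmark for just that effect; the class names are illustrative and unrelated to the trees above:

        using System;
        using System.Diagnostics;

        abstract class Base          { public abstract int Bump(int x); }
        sealed class Derived : Base  { public override int Bump(int x) { return x + 1; } }
        sealed class Direct          { public int Bump(int x) { return x + 1; } }

        class DispatchCost
        {
            const int N = 100000000;

            static void Main()
            {
                Base viaBase = new Derived();   // virtual dispatch through the base type
                var direct = new Direct();      // direct, inlinable call

                var sw = Stopwatch.StartNew();
                int a = 0;
                for (int i = 0; i < N; i++) a = viaBase.Bump(a);
                Console.WriteLine("virtual: " + sw.ElapsedMilliseconds + " ms (" + a + ")");

                sw = Stopwatch.StartNew();
                int b = 0;
                for (int i = 0; i < N; i++) b = direct.Bump(b);
                Console.WriteLine("direct:  " + sw.ElapsedMilliseconds + " ms (" + b + ")");
            }
        }

    A plain virtual call rarely accounts for an 18x gap by itself, so if the two loops time about the same, the slowdown more likely comes from whatever the profiler points at inside BaseBinaryTree.Insert (extra delegate invocations or generic indirection, for example).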


  • ASP.NET MVC Paging/Sorting/Filtering a list using ModelMetadata

    - by rajbk
    This post looks at how to control paging, sorting and filtering when displaying a list of data by specifying attributes in your Model, using the ASP.NET MVC framework and the excellent MVCContrib library. It also shows how to hide/show columns and control the formatting of data using attributes. This uses the Northwind database. A sample project is attached at the end of this post.

    Let's start by looking at a class called ProductViewModel. The properties in the class are decorated with attributes. The OrderBy attribute tells the system that the Model can be sorted using that property. The SearchFilter attribute tells the system that filtering is allowed on that property; the filtering type is set by the FilterType enum, which currently supports Equals and Contains. The ScaffoldColumn attribute specifies whether a column is hidden or not, and the DisplayFormat attribute specifies how the data is formatted.

    public class ProductViewModel
    {
        [OrderBy(IsDefault = true)]
        [ScaffoldColumn(false)]
        public int? ProductID { get; set; }

        [SearchFilter(FilterType.Contains)]
        [OrderBy]
        [DisplayName("Product Name")]
        public string ProductName { get; set; }

        [OrderBy]
        [DisplayName("Unit Price")]
        [DisplayFormat(DataFormatString = "{0:c}")]
        public System.Nullable<decimal> UnitPrice { get; set; }

        [DisplayName("Category Name")]
        public string CategoryName { get; set; }

        [SearchFilter]
        [ScaffoldColumn(false)]
        public int? CategoryID { get; set; }

        [SearchFilter]
        [ScaffoldColumn(false)]
        public int? SupplierID { get; set; }

        [OrderBy]
        public bool Discontinued { get; set; }
    }

    Before we explore the code further, let's look at the UI. The UI has a section for filtering the data. The column headers with links are sortable. Paging is also supported with the help of a pager row. The pager is rendered using the MVCContrib Pager component. The data is displayed using a customized version of the MVCContrib Grid component. The customization was done in order for the Grid to be aware of the attributes mentioned above.

    Now let's look at what happens when we perform actions on this page. The diagram below shows the process. The form on the page has its method set to "GET", therefore we see all the parameters in the query string (shown in blue above). This query gets routed to an action called Index with parameters of type ProductViewModel and PageSortOptions. The parameters in the query string get mapped to the input parameters using model binding. The ProductViewModel object created has the information needed to filter the data, while the PageSortOptions object is used for paging and sorting it. The last block in the figure above shows how the filtered and paged list is created: we receive a product list from our product repository (which is of type IQueryable) and first filter it by calling the AsFiltered extension method, passing in the productFilters object, and then call the AsPagination extension method, passing in the pageSort object. The AsFiltered extension method looks at the type of the filter instance passed in. It skips properties in the instance that do not have the SearchFilter attribute; for properties that do have it, it adds filter expression trees to filter against the IQueryable data. The AsPagination extension method looks at the type of the IQueryable and ensures that the column being sorted on has the OrderBy attribute. If it does not find one, it looks for the default sort field [OrderBy(IsDefault = true)].
    It is required that at least one attribute in your model has [OrderBy(IsDefault = true)]. This is because a person could be performing paging without specifying an order-by column. As you may recall, the LINQ Skip method now requires that you call an OrderBy method before it; therefore we need a default order-by column to perform paging. The extension method adds an order expression tree to the IQueryable and calls the MVCContrib AsPagination extension method to page the data.

    Implementation Notes

    Auto Postback
    The search filter region automatically performs a GET request anytime the dropdown selection is changed. This is implemented using the following jQuery snippet:

    $(document).ready(function () {
        $("#productSearch").change(function () {
            this.submit();
        });
    });

    Strongly Typed View
    The code used in the Action method is shown below:

    public ActionResult Index(ProductViewModel productFilters, PageSortOptions pageSortOptions)
    {
        var productPagedList = productRepository.GetProductsProjected().AsFiltered(productFilters).AsPagination(pageSortOptions);

        var productViewFilterContainer = new ProductViewFilterContainer();
        productViewFilterContainer.Fill(productFilters.CategoryID, productFilters.SupplierID, productFilters.ProductName);

        var gridSortOptions = new GridSortOptions
        {
            Column = pageSortOptions.Column,
            Direction = pageSortOptions.Direction
        };

        var productListContainer = new ProductListContainerModel
        {
            ProductPagedList = productPagedList,
            ProductViewFilterContainer = productViewFilterContainer,
            GridSortOptions = gridSortOptions
        };

        return View(productListContainer);
    }

    As you see above, the object that is returned to the view is of type ProductListContainerModel. This contains all the information needed for the view to render the search filter section (including dropdowns), the Html.Pager (MVCContrib) and the Html.Grid (from MVCContrib). It also stores the state of the search filters so that they can recreate themselves when the page reloads (Viewstate, I miss you! :0). The class diagram for the container class is shown below.

    Custom MVCContrib Grid
    The MVCContrib grid's default behavior was overridden so that it would auto-generate the columns, format the columns based on the metadata, and be aware of our custom attributes (see MetaDataGridModel in the sample code). The Grid ensures that ShowForDisplay on the column is set to true (this can also be set by the ScaffoldColumn attribute; ref: http://bradwilson.typepad.com/blog/2009/10/aspnet-mvc-2-templates-part-2-modelmetadata.html). Column headers are set using the DisplayName attribute. Column sorting is set using the OrderBy attribute. The data is formatted using the DisplayFormat attribute.

    Generic Extension Methods for Sorting and Filtering
    The extension method AsFiltered takes in an IQueryable<T> and uses expression trees to query against the IQueryable data. The query is constructed using the Model metadata and the properties of the T filter (productFilters in our case). Properties in the Model that do not have the SearchFilter attribute are skipped when creating the filter expression tree. It returns an IQueryable<T>. The extension method AsPagination takes in an IQueryable<T> and first ensures that the column being sorted on has the OrderBy attribute; if not, we look for the default OrderBy column ([OrderBy(IsDefault = true)]). We then build an expression tree to sort on this column, and finally hand off the call to the MVCContrib AsPagination extension method, which returns an IPagination<T>. This type, as you can see in the class diagram above, is passed to the view and used by the MVCContrib Grid and Pager components.
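    The internals of AsFiltered aren't shown in this post, but a minimal sketch of the kind of filter-expression building it describes (illustrative only, not the actual sample or MVCContrib code) might look like this: given a property name from the model metadata and a filter value, build x => x.Property.Contains(value) and apply it to the query.

    using System;
    using System.Linq;
    using System.Linq.Expressions;

    static class FilterBuilder
    {
        // Builds x => x.Property.Contains(value) and applies it to the query.
        public static IQueryable<T> WhereContains<T>(
            IQueryable<T> source, string propertyName, string value)
        {
            ParameterExpression x = Expression.Parameter(typeof(T), "x");
            MemberExpression prop = Expression.Property(x, propertyName);
            MethodCallExpression body = Expression.Call(
                prop,
                typeof(string).GetMethod("Contains", new[] { typeof(string) }),
                Expression.Constant(value));
            var predicate = Expression.Lambda<Func<T, bool>>(body, x);
            return source.Where(predicate);
        }
    }

    A Contains filter on ProductName would then be applied as FilterBuilder.WhereContains(products, "ProductName", searchText); an Equals filter would use Expression.Equal instead of the Contains call.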
    Custom Provider
    To get the system to recognize our custom attributes, we create our own MetadataProvider as mentioned in this article (http://bradwilson.typepad.com/blog/2010/01/why-you-dont-need-modelmetadataattributes.html):

    protected override ModelMetadata CreateMetadata(IEnumerable<Attribute> attributes, Type containerType, Func<object> modelAccessor, Type modelType, string propertyName)
    {
        ModelMetadata metadata = base.CreateMetadata(attributes, containerType, modelAccessor, modelType, propertyName);

        SearchFilterAttribute searchFilterAttribute = attributes.OfType<SearchFilterAttribute>().FirstOrDefault();
        if (searchFilterAttribute != null)
        {
            metadata.AdditionalValues.Add(Globals.SearchFilterAttributeKey, searchFilterAttribute);
        }

        OrderByAttribute orderByAttribute = attributes.OfType<OrderByAttribute>().FirstOrDefault();
        if (orderByAttribute != null)
        {
            metadata.AdditionalValues.Add(Globals.OrderByAttributeKey, orderByAttribute);
        }

        return metadata;
    }

    We register our MetadataProvider in Global.asax.cs:

    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();

        RegisterRoutes(RouteTable.Routes);

        ModelMetadataProviders.Current = new MvcFlan.QueryModelMetaDataProvider();
    }

    Bugs, comments and suggestions are welcome! You can download the sample code below. This code is purely experimental; use at your own risk.

    Download Sample Code (VS 2010 RTM): MVCNorthwindSales.zip


  • CodePlex Daily Summary for Monday, March 15, 2010

    New Projects
    - AT Accounts: AT Accounts helps developers to integrate accounting functionality in their applications. It has both the WPF user interface and Silverlight
    - Child page list (for dnn4/5): A free module which can display a sub-pages list for a selected tab. It is template based and supports options like Recursive/Child tab prefix/link...
    - dashCommerce: dashCommerce is the leading ASP.NET e-commerce platform.
    - Fire Utilities: My development utilities and base classes: New Zealand Bank Account Validator
    - FlyCatch (Bugtracking System): A simple web-based bugtracking system.
    - fracback: Fractal feedback concepts, based on video feedback
    - ftc3650: code for ftc 3650
    - Google AJAX Search Services for jQuery: This plug-in encapsulates part of the Google AJAX Search API to streamline the process of Google Search integration.
    - Little Black Book DB: This is the database for the following projects: SQL Azure PHP Connection, SQL Azure Ruby Connection, SQL Azure Python Connection, SQL Azure .NE...
    - MediaCommMVC: MediaCommMVC is a community platform focusing on photos, videos and discussions. It's based on ASP.NET MVC and uses (fluent) nhibernate, jquery an...
    - Miracle OS: The Miracle OS is an OS from Fox. We work on it, but it isn't ready. Do you want to help us? Please send a mail to victor@fox.fi.st
    - Multiwfn: (1) Plotting various graphs (filled color/contour/relief map...) (2) Generate Cube file (3) Manipulate & analyze wavefunction. Supporting lots of proper...
    - MySpace DataRelay: Data Relay is the foundation of MySpace's middle tier. At its heart, it is a messaging system for relaying information both between clients and ser...
    - NinjaCMS: Ninja CMS is an asp.net based content management system which provides a designer-friendly, developer-friendly interface to work with. It's flexibl...
    - open gaze and mouse analyzer: Ogama allows recording and analyzing eye- and mouse-tracking data from slideshow eyetracking experiments in parallel. It's developed in C#.NET and ...
    - Özkasoft.Net | E-Commerce: Özkasoft's E-Commerce Project
    - ProfiCV: Profi CV
    - pyTarget: Implements a powerful iSCSI target in Python, easily used under most popular systems. It also includes the following features: multi-target, mult...
    - SharePoint Platform Extensions: SharePoint Platform Extensions by Espora.
    - Sorting Algorithm Visualization: Displays Bead Sort, Binary Tree Sort, Bubble Sort, Bucket Sort, Cocktail Sort, Counting Sort, Gnome Sort, In Place ...
    - Specify: A framework for creating executable specifications in .NET.
    - Spell Corrector: A spell corrector that uses Bayes algorithm and a BK (Burkhard-Keller) tree.
    - SQL Azure Ruby Connection: This is a demo to show how to connect to SQL Azure with Ruby on Rails.
    - uManage - AD Self-Service Portal: uManage is an Active Directory Self-Service Portal as well as a Help Desk web application designed for use on intranet systems. It allows users to u...
    - Winforms Rounded Group Box Control: Rounded Group Box - a grouping control with rounded corners, gradients, and drop shadow
    - Wizard Engine: Host-application-agnostic wizard engine platform that allows you to fluently define complex conditional flows and provides means for execution of ...
    - WS-Transfer based File Upload: WS-Transfer based upload of large files in multiple parts
    - XAMLStylePad: XAMLStylePad is a simple-in-use styles and templates XAML editor. It is designed for comfortable coding in XAML with real-time preview result on aut...
    - Your Twitt Engine: [translated from Croatian] This is an application for all people whose monitor is watched at their workplace by an employer or boss. So with this app...

    New Releases
    - AmiBroker Plug-ins with C#. A non official AmiBroker Plug-in SDK - AmiBroker Plug-in SDK v0.0.5: Removed dependency on .NET 4.0; now it works fine with .NET 2.0
    - BeerMath.net - 0.1: Version 0.1. Initial set of calculations supported: IBUs, Color, ABV/ABW
    - Child page list (for dnn4/5) - Child Page List 2.6: Source code is also included in the module package.
    - dashCommerce - dashCommerce Releases: You can download both Source and WebReady packages at http://www.dashcommerce.org. If you wish to submit patches, then use the Source Code tab her...
    - ExcelDna - ExcelDna Version 0.23: 2010/03/14 - Packing and other features. This release adds a number of features to ExcelDna: Add ExplicitExports attribute to ...
    - Family Tree Analyzer - Version 1.0.7.1: Version 1.0.7.0: Update Census form to show family totals. Fix England and Wales Lost Cousins reports to be England OR Wales. Problems with Gedcom in...
    - Foursquare BlogEngine Widget - foursquare widget for BlogEngine.NET Version 0.2: To see the changes which have been made, visit http://philippkueng.ch/post/Foursquare-BlogEngineNET-Widget-Version-02.aspx. For installation instruc...
    - GLB Virtual Player Builder - 0.4.0 Official Archetypes Release: Updated for new archetypes. The builder still includes the old player formats, and you can still import your old players' builds. Please PM me an...
    - Home Access Plus+ - v3.1.4.0: Version 3.1.3.1 release change log: Added breadcrumbs to My Computer. File changes: ~/bin/CHS Extranet.dll, ~/bin/CHS Extranet.pdb, ~/images/arro...
    - Little Black Book DB - Little Black Book R1: This is the first release of the Little Black Book presentation I presented at Confoo. I decided to package the database along with the Windows Az...
    - mite.net - .NET API for mite - Version 1.2.1: Added support for budget type. Modified TimerMapper to return timers. Fixed encoding issue in XML conversion.
    - Multiwfn - multiwfn1.0: multiwfn1.0
    - Multiwfn - multiwfn1.0_source: multiwfn1.0_source
    - Multiwfn - multiwfn1.1: multiwfn1.1
    - Multiwfn - multiwfn1.1_source: multiwfn1.1_source
    - Multiwfn - multiwfn1.2: [translated from Chinese] 1.2, 2010-FEB-9: * Added support for 10f-type orbitals. * New support for unrestricted post-HF wavefunctions, for calculating spin density. * Added direct reading of Gaussian 03/09 fch files, so NBO orbitals can be viewed; see readme example 4.10. * When plotting plane maps, the plane can be defined by entering the coordinates of three points, with a custom origin and translation ve...
    - Multiwfn - multiwfn1.2_source: Includes all the files needed for compilation in CVF6.5
    - PowerShell Community Extensions - 2.0 Beta 2: Release notes: This is a pretty close to final release. We have eliminated all of the names that ran afoul of the module loading mechanism which me...
    - pyTarget - pyTarget.binary-for-windows-x86.rar: pyTarget.binary-for-windows-x86.rar
    - pyTarget - pyTarget.src.tar.bz2: pyTarget.src.tar.bz2
    - RedBulb for XNA Framework - RedBulbConsole (Console, Menu and TrackHUD Sample): http://bayimg.com/image/jalhmaacd.jpg
    - Scrum Sprint Monitor - 1.0.0.45262 (.NET 4.0 RC): Tested against TFS 2010 RC. For the .NET 3.5 SP1 platform, use the .NET 3.5 SP1 download. What is new in this release? Major performance increase ...
    - sELedit - sELedit v1.1: Removed: Clone and Delete button. Added: context menu to item list. Added: Clone and Delete buttons to context menu. Added: Export / Import Item ...
    - Sorting Algorithm Visualization - Beta 1: Sorting Algorithm Visualization
    - Specify - Version 1.0: Version 1.0
    - Spell Corrector - Spell Corrector 0.1: A basic version that supports basic functionality.
    - Spell Corrector - Spell Corrector 0.1 Source Code: Source code of version 0.1
    - Spiral Architecture Driven Development (SADD) - SADD v.0.9: Pre-final release with the NEW materials, now all in English! The final release is coming soon, after the guest column for the SADD publication in MS Ar...
    - Spiral Architecture Driven Development (SADD) for Russian - SADD v.0.9: Pre-final release with the NEW materials, now all in English! The final release is coming soon, after the guest column for the SADD publication in MS Ar...
    - SQL Azure Ruby Connection - Little Black Book Ruby R1: This is the Ruby demo that I demonstrated at Confoo. Special thanks to Tony Thompson for putting this demo together. To check out Tony's portfolio ...
    - The Scrum Factory - The Scrum Factory Server - V1a: This is the newest version of the server. Some minor bugs from version V1 were fixed, and some slight changes were made to some database views.
    - twNowplaying - twNowplaying 1.0.0.4: Please note that the user has to press the Twitter logo to log in the first time the application is started.
    - uManage - AD Self-Service Portal - uManage v1.0 (.NET 4.0 RC): Initial release of uManage. NOTE: Designed for ASP.NET and .NET 4.0 RC ONLY! This is the initial release of uManage and covers the first phase of ...
    - Virtu - Virtu 0.8: Source requirements: .NET Framework 3.5 with Service Pack 1; Visual Studio 2008 with Service Pack 1, or Visual C# 2008 Express Edition with Service Pa...
    - Visual Studio DSite - Speech Synthesizer (Text to Speech) in Visual C++: A very simple text-to-speech program written in Visual C++ 2008.
    - White Tiger - 0.0.4.0: * Now you can disable the file security checks. * WinForms applications created to manage tables.
    - Winforms Rounded Group Box Control - Release 1.0: To use this control, simply add the class to your project and compile it. It will then show up in the project's components section in the toolbox. ...
    - WS-Transfer based File Upload - 0.5: Implements the binary file transfer mechanism only.
    - XsltDb - DotNetNuke XSLT module - 01.00.89: Super modules configuration names. 16767 - Fixed more bug fixes...
    - Yakiimo3D - DirectX11 Rheinhard Tonemapping Source and Binary: DirectX11 Rheinhard tonemapping source and binary.
    - Your Twitt Engine - test: [translated from Croatian] Feel free to try it with your Twitter account.

    Most Popular Projects
    - MetaSharp
    - WBFS Manager
    - Rawr
    - AJAX Control Toolkit
    - Microsoft SQL Server Product Samples: Database
    - Silverlight Toolkit
    - ASP.NET Ajax Library
    - Windows Presentation Foundation (WPF)
    - ASP.NET
    - LiveUpload to Facebook

    Most Active Projects
    - LINQ to Twitter
    - Rawr
    - N2 CMS
    - BlogEngine.NET
    - patterns & practices – Enterprise Library
    - SharePoint Team-Mailer
    - jQuery Library for SharePoint Web Services
    - Caliburn: An Application Framework for WPF and Silverlight
    - Farseer Physics Engine
    - Calcium: A modular application toolset leveraging Prism


  • jQuery Treeview – Expand and Collapse All Without the TreeControl

    - by Ben Griswold
    The jQuery Treeview Plugin provides collapse all, expand all and toggle all support with very little effort on your part. Simply add a treecontrol with three links, and the treeview, to your page…

    <div id="treecontrol">
        <a title="Collapse the entire tree below" href="#"><img src="../images/minus.gif" /> Collapse All</a>
        <a title="Expand the entire tree below" href="#"><img src="../images/plus.gif" /> Expand All</a>
        <a title="Toggle the tree below, opening closed branches, closing open branches" href="#">Toggle All</a>
    </div>
    <ul id="treeview" class="treeview-black">
        <li>Item 1</li>
        <li>
            <span>Item 2</span>
            <ul>
                <li>
                    <span>Item 2.1</span>
                    <ul>
                        <li>Item 2.1.1</li>
                        <li>Item 2.1.2</li>
                    </ul>
                </li>
                <li>Item 2.2</li>
                <li class="closed">
                    <span>Item 2.3 (closed at start)</span>
                    <ul>
                        <li>Item 2.3.1</li>
                        <li>Item 2.3.2</li>
                    </ul>
                </li>
            </ul>
        </li>
    </ul>

    …and then associate the control with the treeview when defining the treeview settings.

    $("#treeview").treeview({
        control: "#treecontrol",
        persist: "cookie",
        cookieId: "treeview-black"
    });

    It really couldn't be easier and it works great! But the plugin doesn't offer a lot of flexibility when it comes to layout. For example, the plugin assumes your treecontrol will include links. If you want buttons or images or whatever, you are out of luck. The plugin also assumes a set number of links: the collapse all handler is associated with the first link inside of the treecontrol, a:eq(0), the expand all handler is associated with the second link, and so on. So you really can't incorporate the toggle all link by itself unless you manually hide the other options. Which brings me to the point of this post – making the collapse/expand/toggle layout more flexible without modifying the plugin's source code. We will continue to use the treecontrol actions, but we're not going to use them directly. In fact, our custom collapse, expand and toggle links will trigger the actions for us. So, hide the treecontrol links and associate the treecontrol click events with the click events of other controls. Finally, render the treeview with the same setting definitions as usual.

    $(document).ready(function() {
        // The plugin shows the treecontrol after the
        // collapse, expand and toggle events are hooked up.
        // Just hide the links.
        $('#treecontrol a').hide();

        // On click of your custom links, buttons, etc.,
        // trigger the appropriate treecontrol click.
        $('#expandAll').click(function() {
            $('#treecontrol a:eq(1)').click();
        });

        $('#collapseAll').click(function() {
            $('#treecontrol a:eq(0)').click();
        });

        // Render the treeview per usual.
        $("#treeview").treeview({
            control: "#treecontrol",
            persist: "cookie",
            cookieId: "treeview-black"
        });
    });

    Since I'm not using the treecontrol directly, I move it to the bottom of the page, but it doesn't really matter where the treecontrol goes.

    <div>
        <a id="collapseAll" href="#">Collapse All Outside of TreeControl</a>
    </div>

    <ul id="treeview" class="treeview-black">
        <li>Item 1</li>
        <li>
            <span>Item 2</span>
            <ul>
                <li>
                    <span>Item 2.1</span>
                    <ul>
                        <li>Item 2.1.1</li>
                        <li>Item 2.1.2</li>
                    </ul>
                </li>
                <li>Item 2.2</li>
                <li class="closed">
                    <span>Item 2.3 (closed at start)</span>
                    <ul>
                        <li>Item 2.3.1</li>
                        <li>Item 2.3.2</li>
                    </ul>
                </li>
            </ul>
        </li>
    </ul>

    <div>
        <input type="button" id="expandAll" value="Expand All Outside of TreeControl"/>
    </div>

    <div id="treecontrol">
        <a href="#"></a><a href="#"></a><a href="#"></a>
    </div>

    The above jQuery and Html snippets generate the following ugly output, which shows the separated collapse/expand elements. If you want the toggle all option, I bet you can figure out how to put it in place. I've made the source available below if you're interested.

    Download jQuery Treeview Expand and Collapse Super Code


  • Project Navigation and File Nesting in ASP.NET MVC Projects

    - by Rick Strahl
    More and more I'm finding myself getting lost in the files in some of my larger Web projects. There's so much freaking content to deal with – HTML Views, several derived CSS pages, page-level CSS, script libraries, application-wide scripts and page-specific script files, etc. Thankfully I use Resharper and the Ctrl-T Go to Anything, which autocompletes you to any file, type or member rapidly. Awesome – except when I forget, or when I'm not quite sure of the name of what I'm looking for. Project navigation is still important. Sometimes while working on a project I seem to have 30 or more files open, and trying to locate another new file to open in the solution often ends up being a mental exercise – "where did I put that thing?" It's those little hesitations that tend to get in the way of workflow frequently. To make things worse, most NuGet packages for client-side frameworks and scripts dump stuff into folders that I generally don't use. I've never been a fan of the 'Content' folder in MVC, which is just an empty layer that doesn't serve much of a purpose. It's usually the first thing I nuke in every MVC project. To me the project root is where the actual content for a site goes – is there really a need to add another folder to force another path into every resource you use? It's ugly and also inefficient, as it adds additional bytes to every resource link you embed into a page.

    Alternatives
    I've been playing around with different folder layouts recently and found that moving my cheese around has actually made project navigation much easier. In this post I show a couple of things I've found useful; maybe you'll find some of these useful as well, or at least get some ideas about what can be changed to provide better project flow. The first thing I've been doing is add a root Code folder and put all server code into that. I'm a big fan of treating the Web project root folder as my Web root folder, so all content comes from the root without unneeded nesting like the Content folder. By moving all server code out of the root tree (except for Code), the root tree becomes a lot cleaner immediately, as you remove Controllers, App_Start, Models etc. and move them underneath Code. Yes, this adds another folder level for server code, but it leaves only code-related things in one place that's easier to jump back and forth in. Additionally, I find myself doing a lot less with server-side code these days and more with client-side code, so I want the server code separated from that. The root folder itself then serves as the root content folder. Specifically, I have the Views folder below it, as well as the Css and Scripts folders, which serve to hold only common libraries and global CSS and scripts code. These days of building SPA-style applications, I also tend to have an App folder there where I keep my application-specific JavaScript files, as well as HTML view templates for client SPA apps like Angular. Here's an example of what this looks like in a relatively small project: The goal is to keep things that are related together, so I don't end up jumping around so much in the solution to get to specific project items. The Code folder may irk some of you and hark back to the days of the App_Code folder in non-Web-Application projects, but these days I find myself messing with a lot less server-side code and much more with client-side files – HTML, CSS and JavaScript. Generally I work on a single controller at a time – once that's open, that's typically the only server code I work with regularly.
Business logic lives in another project altogether, so other than the controller and maybe ViewModels there’s not a lot of code being accessed in the Code folder. So throwing that off the root and isolating it seems like an easy win. Nesting Page specific content In a lot of my existing applications that are pure server side MVC applications, perhaps with some JavaScript associated with them, I tend to have page level JavaScript and CSS files. For these types of pages I actually prefer the local files stored in the same folder as the parent view. So typically I have .css and .js files with the same name as the view in the same folder. This looks something like this: In order for this to work you have to also make a configuration change inside of the /Views/web.config file, as the Views folder is blocked with the BlockViewHandler that prohibits access to content from that folder. It’s easy to fix by changing the path from * to *.cshtml or *.vbhtml so that only view retrieval is blocked:<system.webServer> <handlers> <remove name="BlockViewHandler"/> <add name="BlockViewHandler" path="*.cshtml" verb="*" preCondition="integratedMode" type="System.Web.HttpNotFoundHandler" /> </handlers> </system.webServer> With this in place, from inside of your Views you can then reference those same resources like this:<link href="~/Views/Admin/QuizPrognosisItems.css" rel="stylesheet" /> and<script src="~/Views/Admin/QuizPrognosisItems.js"></script> which works fine. JavaScript and CSS files in the Views folder deploy just like the .cshtml files do and can be referenced from this folder as well. Making this happen is not really as straightforward as it should be with just Visual Studio unfortunately, as there’s no easy way to get the file nesting from the VS IDE directly (you have to modify the .csproj file). However, Mads Kristensen has a nice Visual Studio Add-in that provides file nesting via a shortcut menu option. Using this you can select each of the ‘child’ files and then nest them under a parent file. In the case above I select the .js and .css files and nest them underneath the .cshtml view. I was even toying with the idea of throwing the controller.cs files into the Views folder, but that’s maybe going a little too far :-) It would work however as Visual Studio doesn’t publish .cs files and the compiler doesn’t care where the files live. There are lots of options and if you think that would make life easier it’s another option to help group related things together. Are there any downsides to this? Possibly – if you’re using automated minification/packaging tools like ASP.NET Bundling or Grunt/Gulp with Uglify, it becomes a little harder to group script and css files for minification as you may end up looking in multiple folders instead of a single folder. But – again that’s a one time configuration step that’s easily handled and much less intrusive than constantly having to search for files in your project.
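As an aside, the nesting that the add-in (or hand-editing the .csproj) sets up boils down to a DependentUpon element on each child item. Here’s a minimal sketch using the QuizPrognosisItems view from above – the exact ItemGroup placement in your project file may differ:

<ItemGroup>
  <Content Include="Views\Admin\QuizPrognosisItems.cshtml" />
  <Content Include="Views\Admin\QuizPrognosisItems.css">
    <!-- nests the file under the view in Solution Explorer -->
    <DependentUpon>QuizPrognosisItems.cshtml</DependentUpon>
  </Content>
  <Content Include="Views\Admin\QuizPrognosisItems.js">
    <DependentUpon>QuizPrognosisItems.cshtml</DependentUpon>
  </Content>
</ItemGroup>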
Client Side Folders The particular project shown in the screenshots above is a traditional server side ASP.NET MVC application with most content rendered into server side Razor pages. There’s a fair amount of client side stuff happening on these pages as well – specifically several of these pages are self contained single page Angular applications that deal with 1 or maybe 2 separate views, and the layout I’ve shown above really focuses on the server side aspect where there are Razor views with related script and css resources. For applications that are more client centric and have a lot more script and HTML template based content I tend to use the same layout for the server components, but the client side code can often be broken out differently. In SPA type applications I tend to follow the App folder approach where all the application pieces that make up the SPA application end up below the App folder. Here’s what that looks like for me – here this is an AngularJS project: In this case the App folder holds both the application specific js files, and the partial HTML views that get loaded into this single page SPA application. In this particular Angular SPA application that has controllers linked to particular partial views, I prefer to keep the script files that are associated with the views – AngularJS controllers in this case – with the actual partials. Again I like the proximity of the view with the main code associated with the view, because 90% of the UI application code that gets written is handled between these two files. This approach works well, but only if controllers are fairly closely aligned with the partials. If you have many smaller sub-controllers or lots of directives where the alignment between views and code is more segmented this approach starts falling apart and you’ll probably be better off with separate folders in the js folder. Following Angular conventions you’d have controllers/directives/services etc. folders. Please note that I’m not saying any of these ways are right or wrong – this is just what has worked for me and why! Skipping Project Navigation altogether with Resharper I’ve talked a bit about project navigation in the project tree, which is a common way to navigate and which we all use at least some of the time, but if you use a tool like Resharper – which has Ctrl-T to jump to anything – you can quickly navigate with a shortcut key and autocomplete search. Here’s what Resharper’s jump to anything looks like: Resharper’s Goto Anything box lets you type and quick search over files, classes and members of the entire solution, which is a very fast and powerful way to find what you’re looking for in your project, bypassing the Solution Explorer altogether. As long as you remember to use it (which I sometimes don’t) and you know what you’re looking for, it’s by far the quickest way to find things in a project. It’s a shame that this sort of a simple search interface isn’t part of the native Visual Studio IDE. Work how you like to work Ultimately it all comes down to workflow and how you like to work, and what makes *you* more productive. Following pre-defined patterns is great for consistency, as long as they don’t get in the way of how you work. A lot of the default folder structures in Visual Studio for ASP.NET MVC were defined when things were done differently. These days we’re dealing with a lot more diverse project content than when ASP.NET MVC was originally introduced, and project organization definitely is something that can get in the way if it doesn’t fit your workflow. So take a look and see what works well and what might benefit from organizing files differently. As with so many things in ASP.NET, as things evolve and tend to get more complex I’ve found that I end up fighting some of the conventions. The good news is that you don’t have to follow the conventions and you have the freedom to do just about anything that works for you. 
Even though what I’ve shown here diverges from conventions, I don’t think anybody would stumble over these relatively minor changes and not immediately figure out where things live, even in larger projects. But nevertheless think long and hard before breaking those conventions – if there isn’t a good reason to break them, or the changes don’t provide improved workflow, then it’s not worth it. Break the rules, but only if there’s a quantifiable benefit. You may not agree with how I’ve chosen to diverge from the standard project structures in this article, but maybe it gives you some ideas of how you can mix things up to make your existing project flow a little nicer and make it easier to navigate for your environment. © Rick Strahl, West Wind Technologies, 2005-2014. Posted in ASP.NET MVC

    Read the article

  • Hello Operator, My Switch Is Bored

    - by Paul White
    This is a post for T-SQL Tuesday #43 hosted by my good friend Rob Farley. The topic this month is Plan Operators. I haven’t taken part in T-SQL Tuesday before, but I do like to write about execution plans, so this seemed like a good time to start. This post is in two parts. The first part is primarily an excuse to use a pretty bad play on words in the title of this blog post (if you’re too young to know what a telephone operator or a switchboard is, I hate you). The second part of the post looks at an invisible query plan operator (so to speak). 1. My Switch Is Bored Allow me to present the rare and interesting execution plan operator, Switch: Books Online has this to say about Switch: Following that description, I had a go at producing a Fast Forward Cursor plan that used the TOP operator, but had no luck. That may be due to my lack of skill with cursors, I’m not too sure. The only application of Switch in SQL Server 2012 that I am familiar with requires a local partitioned view: CREATE TABLE dbo.T1 (c1 int NOT NULL CHECK (c1 BETWEEN 00 AND 24)); CREATE TABLE dbo.T2 (c1 int NOT NULL CHECK (c1 BETWEEN 25 AND 49)); CREATE TABLE dbo.T3 (c1 int NOT NULL CHECK (c1 BETWEEN 50 AND 74)); CREATE TABLE dbo.T4 (c1 int NOT NULL CHECK (c1 BETWEEN 75 AND 99)); GO CREATE VIEW V1 AS SELECT c1 FROM dbo.T1 UNION ALL SELECT c1 FROM dbo.T2 UNION ALL SELECT c1 FROM dbo.T3 UNION ALL SELECT c1 FROM dbo.T4; Not only that, but it needs an updatable local partitioned view. We’ll need some primary keys to meet that requirement: ALTER TABLE dbo.T1 ADD CONSTRAINT PK_T1 PRIMARY KEY (c1);   ALTER TABLE dbo.T2 ADD CONSTRAINT PK_T2 PRIMARY KEY (c1);   ALTER TABLE dbo.T3 ADD CONSTRAINT PK_T3 PRIMARY KEY (c1);   ALTER TABLE dbo.T4 ADD CONSTRAINT PK_T4 PRIMARY KEY (c1); We also need an INSERT statement that references the view. Even more specifically, to see a Switch operator, we need to perform a single-row insert (multi-row inserts use a different plan shape): INSERT dbo.V1 (c1) VALUES (1); And now…the execution plan: The Constant Scan manufactures a single row with no columns. The Compute Scalar works out which partition of the view the new value should go in. The Assert checks that the computed partition number is not null (if it is, an error is returned). The Nested Loops Join executes exactly once, with the partition id as an outer reference (correlated parameter). The Switch operator checks the value of the parameter and executes the corresponding input only. If the partition id is 0, the uppermost Clustered Index Insert is executed, adding a row to table T1. If the partition id is 1, the next lower Clustered Index Insert is executed, adding a row to table T2…and so on. In case you were wondering, here’s a query and execution plan for a multi-row insert to the view: INSERT dbo.V1 (c1) VALUES (1), (2); Yuck! An Eager Table Spool and four Filters! I prefer the Switch plan. My guess is that almost all the old strategies that used a Switch operator have been replaced over time, using things like a regular Concatenation Union All combined with Start-Up Filters on its inputs. Other new (relative to the Switch operator) features like table partitioning have specific execution plan support that doesn’t need the Switch operator either. This feels like a bit of a shame, but perhaps it is just nostalgia on my part, it’s hard to know. Please do let me know if you encounter a query that can still use the Switch operator in 2012 – it must be very bored if this is the only possible modern usage! 2. 
Invisible Plan Operators The second part of this post uses an example based on a question Dave Ballantyne asked using the SQL Sentry Plan Explorer plan upload facility. If you haven’t tried that yet, make sure you’re on the latest version of the (free) Plan Explorer software, and then click the Post to SQLPerformance.com button. That will create a site question with the query plan attached (which can be anonymized if the plan contains sensitive information). Aaron Bertrand and I keep a close eye on questions there, so if you have ever wanted to ask a query plan question of either of us, that’s a good way to do it. The problem The issue I want to talk about revolves around a query issued against a calendar table. The script below creates a simplified version and adds 100 years of per-day information to it: USE tempdb; GO CREATE TABLE dbo.Calendar ( dt date NOT NULL, isWeekday bit NOT NULL, theYear smallint NOT NULL,   CONSTRAINT PK__dbo_Calendar_dt PRIMARY KEY CLUSTERED (dt) ); GO -- Monday is the first day of the week for me SET DATEFIRST 1;   -- Add 100 years of data INSERT dbo.Calendar WITH (TABLOCKX) (dt, isWeekday, theYear) SELECT CA.dt, isWeekday = CASE WHEN DATEPART(WEEKDAY, CA.dt) IN (6, 7) THEN 0 ELSE 1 END, theYear = YEAR(CA.dt) FROM Sandpit.dbo.Numbers AS N CROSS APPLY ( VALUES (DATEADD(DAY, N.n - 1, CONVERT(date, '01 Jan 2000', 113))) ) AS CA (dt) WHERE N.n BETWEEN 1 AND 36525; The following query counts the number of weekend days in 2013: SELECT Days = COUNT_BIG(*) FROM dbo.Calendar AS C WHERE theYear = 2013 AND isWeekday = 0; It returns the correct result (104) using the following execution plan: The query optimizer has managed to estimate the number of rows returned from the table exactly, based purely on the default statistics created separately on the two columns referenced in the query’s WHERE clause. (Well, almost exactly, the unrounded estimate is 104.289 rows.) There is already an invisible operator in this query plan – a Filter operator used to apply the WHERE clause predicates. We can see it by re-running the query with the enormously useful (but undocumented) trace flag 9130 enabled: Now we can see the full picture. The whole table is scanned, returning all 36,525 rows, before the Filter narrows that down to just the 104 we want. Without the trace flag, the Filter is incorporated in the Clustered Index Scan as a residual predicate. It is a little bit more efficient than using a separate operator, but residual predicates are still something you will want to avoid where possible. The estimates are still spot on though: Anyway, looking to improve the performance of this query, Dave added the following filtered index to the Calendar table: CREATE NONCLUSTERED INDEX Weekends ON dbo.Calendar(theYear) WHERE isWeekday = 0; The original query now produces a much more efficient plan: Unfortunately, the estimated number of rows produced by the seek is now wrong (365 instead of 104): What’s going on? The estimate was spot on before we added the index! Explanation You might want to grab a coffee for this bit. Using another trace flag or two (8606 and 8612) we can see that the cardinality estimates were exactly right initially: The highlighted information shows the initial cardinality estimates for the base table (36,525 rows), the result of applying the two relational selects in our WHERE clause (104 rows), and after performing the COUNT_BIG(*) group by aggregate (1 row). 
All of these are correct, but that was before cost-based optimization got involved :) Cost-based optimization When cost-based optimization starts up, the logical tree above is copied into a structure (the ‘memo’) that has one group per logical operation (roughly speaking). The logical read of the base table (LogOp_Get) ends up in group 7; the two predicates (LogOp_Select) end up in group 8 (with the details of the selections in subgroups 0-6). These two groups still have the correct cardinalities as trace flag 8608 output (initial memo contents) shows: During cost-based optimization, a rule called SelToIdxStrategy runs on group 8. Its job is to match logical selections to indexable expressions (SARGs). It successfully matches the selections (theYear = 2013, isWeekday = 0) to the filtered index, and writes a new alternative into the memo structure. The new alternative is entered into group 8 as option 1 (option 0 was the original LogOp_Select): The new alternative is to do nothing (PhyOp_NOP = no operation), but to instead follow the new logical instructions listed below the NOP. The LogOp_GetIdx (full read of an index) goes into group 21, and the LogOp_SelectIdx (selection on an index) is placed in group 22, operating on the result of group 21. The definition of the comparison ‘theYear = 2013’ (ScaOp_Comp downwards) was already present in the memo starting at group 2, so no new memo groups are created for that. New Cardinality Estimates The new memo groups require two new cardinality estimates to be derived. First, LogOp_GetIdx (full read of the index) gets a predicted cardinality of 10,436. This number comes from the filtered index statistics: DBCC SHOW_STATISTICS (Calendar, Weekends) WITH STAT_HEADER; The second new cardinality derivation is for the LogOp_SelectIdx applying the predicate (theYear = 2013). To get a number for this, the cardinality estimator uses statistics for the column ‘theYear’, producing an estimate of 365 rows (there are 365 days in 2013!): DBCC SHOW_STATISTICS (Calendar, theYear) WITH HISTOGRAM; This is where the mistake happens. Cardinality estimation should have used the filtered index statistics here, to get an estimate of 104 rows: DBCC SHOW_STATISTICS (Calendar, Weekends) WITH HISTOGRAM; Unfortunately, the logic has lost sight of the link between the read of the filtered index (LogOp_GetIdx) in group 21, and the selection on that index (LogOp_SelectIdx) that it is deriving a cardinality estimate for, in group 22. The correct cardinality estimate (104 rows) is still present in the memo, attached to group 8, but that group now has a PhyOp_NOP implementation. Skipping over the rest of cost-based optimization (in a belated attempt at brevity) we can see the optimizer’s final output using trace flag 8607: This output shows the (incorrect, but understandable) 365 row estimate for the index range operation, and the correct 104 estimate still attached to its PhyOp_NOP. This tree still has to go through a few post-optimizer rewrites and ‘copy out’ from the memo structure into a tree suitable for the execution engine. One step in this process removes PhyOp_NOP, discarding its 104-row cardinality estimate as it does so. To finish this section on a more positive note, consider what happens if we add an OVER clause to the query aggregate. 
This isn’t intended to be a ‘fix’ of any sort, I just want to show you that the 104 estimate can survive and be used if later cardinality estimation needs it: SELECT Days = COUNT_BIG(*) OVER () FROM dbo.Calendar AS C WHERE theYear = 2013 AND isWeekday = 0; The estimated execution plan is: Note the 365 estimate at the Index Seek, but the 104 lives again at the Segment! We can imagine the lost predicate ‘isWeekday = 0’ as sitting between the seek and the segment in an invisible Filter operator that drops the estimate from 365 to 104. Even though the NOP group is removed after optimization (so we don’t see it in the execution plan) bear in mind that all cost-based choices were made with the 104-row memo group present, so although things look a bit odd, it shouldn’t affect the optimizer’s plan selection. I should also mention that we can work around the estimation issue by including the index’s filtering columns in the index key: CREATE NONCLUSTERED INDEX Weekends ON dbo.Calendar(theYear, isWeekday) WHERE isWeekday = 0 WITH (DROP_EXISTING = ON); There are some downsides to doing this, including that changes to the isWeekday column may now require Halloween Protection, but that is unlikely to be a big problem for a static calendar table ;)  With the updated index in place, the original query produces an execution plan with the correct cardinality estimation showing at the Index Seek: That’s all for today, remember to let me know about any Switch plans you come across on a modern instance of SQL Server! Finally, here are some other posts of mine that cover other plan operators: Segment and Sequence Project Common Subexpression Spools Why Plan Operators Run Backwards Row Goals and the Top Operator Hash Match Flow Distinct Top N Sort Index Spools and Page Splits Singleton and Range Seeks Bitmaps Hash Join Performance Compute Scalar © 2013 Paul White – All Rights Reserved Twitter: @SQL_Kiwi

    Read the article

  • The UIManager Pattern

    - by Duncan Mills
One of the most common mistakes that I see when reviewing ADF application code is the sin of storing UI component references, most commonly things like table or tree components, in Session or PageFlow scope. The reasons why this is bad are simple; firstly, these UI object references are not serializable so would not survive a session migration between servers and secondly there is no guarantee that the framework will re-use the same component tree from request to request, although in practice it generally does do so. So the danger here is that at best you end up with an NPE after your session has migrated, and at worst, you end up pinning old generations of the component tree happily eating up your precious memory. So that's clear, we should never, ever, be storing references to components anywhere other than request scope (or maybe backing bean scope). So double check the scope of those binding attributes that map component references into a managed bean in your applications.  Why is it Such a Common Mistake?  At this point I want to examine why there is this urge to hold onto these references anyway. After all, JSF will obligingly populate your backing beans with the fresh and correct reference when needed.   In most cases, it seems that the rationale is down to a lack of distinction within the application between what is data and what is presentation. I think perhaps a cause of this is the logical separation between business data behind the ADF data binding (#{bindings}) façade and the UI components themselves. Developers tend to think, OK this is my data layer behind the bindings object and everything else is just UI.  Of course that's not the case.  The UI layer itself will have state which is intrinsically linked to the UI presentation rather than the business model, but at the same time should not be tightly bound to a specific instance of any single UI component. So here's the problem.  I think developers try and use the UI components as state-holders for this kind of data, rather than using them to represent that state. An example of this might be something like the selection state of a tabset (panelTabbed), you might be interested in knowing what the currently disclosed tab is. The temptation that leads to the component reference sin is to go and ask the tabset what the selection is.  That of course is fine in context - e.g. a handler within the same request scoped bean that's got the binding to the tabset. However, it leads to problems when you subsequently want the same information outside of the immediate scope.  The simple solution seems to be to chuck that component reference into session scope and then you can simply re-check in the same way, leading of course to this mistake. Turn it on its Head  So the correct solution to this is to turn the problem on its head. If you are going to be interested in the value or state of some component outside of the immediate request context then it becomes persistent state (persistent in the sense that it extends beyond the lifespan of a single request). So you need to externalize that state outside of the component and have the component reference and manipulate that state as needed rather than owning it. This is what I call the UIManager pattern.  Defining the Pattern The UIManager pattern really is very simple. The premise is that every application should define a session scoped managed bean, appropriately named UIManager, which is specifically responsible for holding this persistent UI component related state.  
The actual makeup of the UIManager class varies depending on the needs of the application and the amount of state that needs to be stored. Generally I'll start off with a Map in which individual flags can be created as required, although you could opt for a more formal set of typed member variables with getters and setters, or indeed a mix. This UIManager class is defined as a session scoped managed bean (#{uiManager}) in the faces-config.xml.  The pattern is to then inject this instance of the class into any other managed bean (usually request scope) that needs it using a managed property.  So typically you'll have something like this:   <managed-bean>     <managed-bean-name>uiManager</managed-bean-name>     <managed-bean-class>oracle.demo.view.state.UIManager</managed-bean-class>     <managed-bean-scope>session</managed-bean-scope>   </managed-bean>  This is then injected into any backing bean that needs it:    <managed-bean>     <managed-bean-name>mainPageBB</managed-bean-name>     <managed-bean-class>oracle.demo.view.MainBacking</managed-bean-class>     <managed-bean-scope>request</managed-bean-scope>     <managed-property>       <property-name>uiManager</property-name>       <property-class>oracle.demo.view.state.UIManager</property-class>       <value>#{uiManager}</value>     </managed-property>   </managed-bean> In this case the backing bean in question needs a member variable to hold and reference the UIManager: private UIManager _uiManager;  Which should be exposed via a getter and setter pair with names that match the managed property name (e.g. setUiManager(UIManager _uiManager), getUiManager()).  This will then give your code within the backing bean full access to the UI state. UI components in the page can, of course, directly reference the uiManager bean in their properties, for example, going back to the tab-set example you might have something like this: <af:panelTabbed>   <af:showDetailItem text="First"                disclosed="#{uiManager.settings['MAIN_TABSET_STATE']['FIRST']}"> ...   </af:showDetailItem>   <af:showDetailItem text="Second"                      disclosed="#{uiManager.settings['MAIN_TABSET_STATE']['SECOND']}">     ...   </af:showDetailItem>   ... </af:panelTabbed> Where in this case the settings member within the UIManager is a Map which contains a Map of Booleans for each tab under the MAIN_TABSET_STATE key. (Just an example – you could choose to store just an identifier for the selected tab or whatever; how you choose to store the state within the UIManager is up to you.) Get into the Habit So we can see that the UIManager pattern is not a great strain to implement for an application and can even be retrofitted to an existing application with ease. The point is, however, that you should always take this approach rather than committing the sin of persistent component references which will bite you in the future, or shotgun-scattered UI flags on the session which are hard to maintain.  If you take the approach of always accessing all UI state via the uiManager, or perhaps a pageScope focused variant of it, you'll find your applications much easier to understand and maintain. Do it today!
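To make the Map-based starting point concrete, here's a minimal sketch of what such a UIManager class might look like – the shape is an assumption based on the description above, not code from the original post:

package oracle.demo.view.state;

import java.util.HashMap;
import java.util.Map;

public class UIManager {
    // general-purpose bag of persistent UI state, keyed by feature (e.g. MAIN_TABSET_STATE)
    private Map<String, Object> settings = new HashMap<String, Object>();

    public Map<String, Object> getSettings() {
        return settings;
    }

    public void setSettings(Map<String, Object> settings) {
        this.settings = settings;
    }
}

From EL, #{uiManager.settings['MAIN_TABSET_STATE']} resolves through the getSettings() accessor, and nested Maps (or typed getters and setters you add later) hang off that.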

    Read the article

  • April 30th Links: ASP.NET, ASP.NET MVC, Visual Studio 2010

    - by ScottGu
Here is the latest in my link-listing series. [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu] ASP.NET Data Web Control Enhancements in ASP.NET 4.0: Scott Mitchell has a good article that summarizes some of the nice improvements coming to the ASP.NET 4 data controls. Refreshing an ASP.NET AJAX UpdatePanel with JavaScript: Scott Mitchell has another nice article in his series on using ASP.NET AJAX that demonstrates how to programmatically trigger an UpdatePanel refresh using JavaScript on the client. ASP.NET MVC ASP.NET MVC 2: Basics and Introduction: Scott Hanselman delivers an awesome introductory talk on ASP.NET MVC.  Great for people looking to understand and learn ASP.NET MVC. ASP.NET MVC 2: Ninja Black Belt Tips: Another great talk by Scott Hanselman about how to make the most of several features of ASP.NET MVC 2. ASP.NET MVC 2 Html.Editor/Display Templates: A great blog post detailing the new Html.EditorFor() and Html.DisplayFor() helpers within ASP.NET MVC 2. MVCContrib Grid: Jeremy Skinner’s video presentation about the new Html.Grid() helper component within the (most awesome) MvcContrib project for ASP.NET MVC. Code Snippets for ASP.NET MVC 2 in VS 2010: Raj Kaimal documents some of the new code snippets for ASP.NET MVC 2 that are now built into Visual Studio 2010.  Read this article to learn how to do common scenarios with fewer keystrokes. Turn on Compile-time View Checking for ASP.NET MVC Projects in TFS 2010 Build: Jim Lamb has a nice post that describes how to enable compile-time view checking as part of automated builds done with a TFS Build Server.  This will ensure any errors in your view templates raise build-errors (allowing you to catch them at build-time instead of runtime). Visual Studio 2010 VS 2010 Keyboard Shortcut Posters for VB, C#, F# and C++: Keyboard shortcut posters that you can download and then print out. Ideal to provide a quick reference on your desk for common keystroke actions inside VS 2010. My Favorite New Features in VS 2010: Scott Mitchell has a nice article that summarizes some of his favorite new features in VS 2010.  Check out my VS 2010 and .NET 4 blog series for more details on some of them. 6 Cool VS 2010 Quick Tips and Features: Anoop has a nice blog post describing 6 cool features of VS 2010 that you can take advantage of. SharePoint Development with VS 2010: Beth Massi links to a bunch of nice “How do I?” videos that demonstrate how to use the SharePoint development support built into VS 2010. How to Pin a Project to the Recent Projects List in VS 2010: A useful tip/trick that demonstrates how to “pin” a project to always show up on the “Recent Projects” list within Visual Studio 2010. Using the WPF Tree Visualizer in VS 2010: Zain blogs about the new WPF Tree Visualizer supported by the VS 2010 debugger.  This makes it easier to visualize WPF control hierarchies within the debugger. TFS 2010 Power Tools Released: Brian Harry blogs about the cool new TFS 2010 extensions released with this week’s TFS 2010 Power Tools release. What is New with T4 in VS 2010: T4 is the name of Visual Studio’s template-based code generation technology.  Lots of scenarios within VS 2010 now use T4 for code generation customization. Two examples are ASP.NET MVC Views and EF4 Model Generation.  This post describes some of the many T4 infrastructure improvements in VS 2010. Hope this helps, Scott P.S. 
If you haven’t already, check out this month’s “Find a Hoster” page on the www.asp.net website to learn about great (and very inexpensive) ASP.NET hosting offers.

    Read the article

  • Can't install git on Ubuntu 12.10

    - by Lucas Windir
I'm following these instructions to install git on my laptop: http://git-scm.com/download/linux When I do: $ sudo apt-get install git-core This is what my terminal shows: Reading package lists... Done Building dependency tree Reading state information... Done The following packages were automatically installed and are no longer required: libasprintf0c2:i386 libcroco3:i386 libgettextpo0:i386 libgomp1:i386 libunistring0:i386 Use 'apt-get autoremove' to remove them. The following extra packages will be installed: git git-man liberror-perl Suggested packages: git-daemon-run git-daemon-sysvinit git-doc git-el git-arch git-cvs git-svn git-email git-gui gitk gitweb The following NEW packages will be installed: git git-core git-man liberror-perl 0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded. Need to get 6,825 kB of archives. After this operation, 15.3 MB of additional disk space will be used. Do you want to continue [Y/n]? y WARNING: The following packages cannot be authenticated! liberror-perl git-man git git-core Install these packages without verification [y/N]? E: Some packages could not be authenticated lucas@lucas-Inspiron-N5050:~$ sudo apt-get install git-core Reading package lists... Done Building dependency tree Reading state information... Done The following packages were automatically installed and are no longer required: libasprintf0c2:i386 libcroco3:i386 libgettextpo0:i386 libgomp1:i386 libunistring0:i386 Use 'apt-get autoremove' to remove them. The following extra packages will be installed: git git-man liberror-perl Suggested packages: git-daemon-run git-daemon-sysvinit git-doc git-el git-arch git-cvs git-svn git-email git-gui gitk gitweb The following NEW packages will be installed: git git-core git-man liberror-perl 0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded. Need to get 6,825 kB of archives. After this operation, 15.3 MB of additional disk space will be used. Do you want to continue [Y/n]? y WARNING: The following packages cannot be authenticated! liberror-perl git-man git git-core Install these packages without verification [y/N]? 
y Err http://py.archive.ubuntu.com/ubuntu/ quantal/main liberror-perl all 0.17-1 Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Err http://py.archive.ubuntu.com/ubuntu/ quantal/main git-man all 1:1.7.10.4-1ubuntu1 Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Err http://py.archive.ubuntu.com/ubuntu/ quantal/main git amd64 1:1.7.10.4-1ubuntu1 Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Err http://py.archive.ubuntu.com/ubuntu/ quantal/main git-core all 1:1.7.10.4-1ubuntu1 Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Failed to fetch http://py.archive.ubuntu.com/ubuntu/pool/main/libe/liberror-perl/liberror-perl_0.17-1_all.deb Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Failed to fetch http://py.archive.ubuntu.com/ubuntu/pool/main/g/git/git-man_1.7.10.4-1ubuntu1_all.deb Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Failed to fetch http://py.archive.ubuntu.com/ubuntu/pool/main/g/git/git_1.7.10.4-1ubuntu1_amd64.deb Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) Failed to fetch http://py.archive.ubuntu.com/ubuntu/pool/main/g/git/git-core_1.7.10.4-1ubuntu1_all.deb Something wicked happened resolving 'py.archive.ubuntu.com:http' (-5 - No address associated with hostname) E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing? How could I install git on Ubuntu 12.10? I can't even do it from the Ubuntu Software Center. Thanks in advance!
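A possible workaround, assuming /etc/apt/sources.list references the py.archive.ubuntu.com mirror shown in the errors above (the "Something wicked happened resolving" lines are DNS failures for that regional mirror), is to point apt at the main archive and retry:

# switch the unreachable regional mirror to the main Ubuntu archive, then retry
sudo sed -i 's|py.archive.ubuntu.com|archive.ubuntu.com|g' /etc/apt/sources.list
sudo apt-get update
sudo apt-get install git-core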

    Read the article

  • Stuck with Apache2

    - by Gundars Meness
I can't finish the Apache2 install, and I also cannot remove it. It has blocked my dpkg; now I can't get any installations in or out. I even tried a distro upgrade, but it still has a broken dpkg. How can I fix this and get Apache2 running normally? Just for the heck of it: gundars@SR528:~$ sudo apt-get remove apache2-common Reading package lists... Done Building dependency tree Reading state information... Done Package 'apache2-common' is not installed, so not removed 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded. 2 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Setting up apache2.2-common (2.2.22-6ubuntu2) ... ERROR: Site default does not exist! dpkg: error processing apache2.2-common (--configure): subprocess installed post-installation script returned error exit status 1 No apport report written because MaxReports is reached already dpkg: dependency problems prevent configuration of apache2-mpm-prefork: apache2-mpm-prefork depends on apache2.2-common (= 2.2.22-6ubuntu2); however: Package apache2.2-common is not configured yet. dpkg: error processing apache2-mpm-prefork (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already Errors were encountered while processing: apache2.2-common apache2-mpm-prefork E: Sub-process /usr/bin/dpkg returned an error code (1) sudo apt-get -f install apache2 apache2.2-common apache2-mpm-prefork [sudo] password for gundars: Reading package lists... Done Building dependency tree Reading state information... Done apache2 is already the newest version. apache2-mpm-prefork is already the newest version. apache2.2-common is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded. 4 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? y Setting up apache2.2-common (2.2.22-6ubuntu2) ... ERROR: Site default does not exist! dpkg: error processing apache2.2-common (--configure): subprocess installed post-installation script returned error exit status 1 No apport report written because MaxReports is reached already dpkg: dependency problems prevent configuration of apache2-mpm-prefork: apache2-mpm-prefork depends on apache2.2-common (= 2.2.22-6ubuntu2); however: Package apache2.2-common is not configured yet. dpkg: error processing apache2-mpm-prefork (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already dpkg: dependency problems prevent configuration of apache2: apache2 depends on apache2-mpm-worker (= 2.2.22-6ubuntu2) | apache2-mpm-prefork (= 2.2.22-6ubuntu2) | apache2-mpm-event (= 2.2.22-6ubuntu2) | apache2-mpm-itk (= 2.2.22-6ubuntu2); however: Package apache2-mpm-worker is not installed. Package apache2-mpm-prefork is not configured yet. Package apache2-mpm-event is not installed. Package apache2-mpm-itk is not installed. apache2 depends on apache2.2-common (= 2.2.22-6ubuntu2); however: Package apache2.2-common is not configured yet. dpkg: error processing apache2 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libapache2-mod-php5: libapache2-mod-php5 depends on apache2-mpm-prefork (>> 2.0.52) | apache2-mpm-itk; however: Package apache2-mpm-prefork is not configured yet. Package apache2-mpm-itk is not installed. libapache2-mod-php5 depends on apache2.2-common; however: Package apache2.2-common is not configured yet. 
No apport report written because MaxReports is reached already No apport report written because MaxReports is reached already dpkg: error processing libapache2-mod-php5 (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: apache2.2-common apache2-mpm-prefork apache2 libapache2-mod-php5 E: Sub-process /usr/bin/dpkg returned an error code (1)
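One possible way to unblock dpkg here, assuming the postinst is failing only on the missing default site (the "ERROR: Site default does not exist!" line), is to recreate a bare default vhost and let configuration finish – a sketch, not a guaranteed fix:

# recreate a minimal default site so apache2.2-common's postinst can enable it
printf '<VirtualHost *:80>\n    DocumentRoot /var/www\n</VirtualHost>\n' | sudo tee /etc/apache2/sites-available/default
sudo dpkg --configure -a
sudo apt-get -f install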

    Read the article

  • URGENT: Firefox circular-dependency hell - Linux Mint 13 (based on Ubuntu 12.04)

    - by Tyler J Fisher
Having difficulty re-installing Firefox after an installation to resolve places.sqlite issues. It appears that I'm trapped in circular dependency hell. Need to resolve firefox dependency hell to attempt to resolve Tomcat6 project dependencies (don't ask), ASAP. Have been trying for hours. What I've done (brief) 1) sudo apt-get purge firefox firefox-globalmenu firefox-gnome-support 2) sudo apt-get update 3) sudo apt-get install firefox firefox-globalmenu firefox-gnome-support 4) sudo apt-get -f install Potential error sources: Found in (sudo apt-get install firefox firefox-globalmenu firefox-gnome-support) dpkg: error processing /var/cache/apt/archives/firefox_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb (--unpack): trying to overwrite '/usr/lib/firefox/extensions', which is also in package mint-search-addon 2012.05.11 So, /usr/lib/firefox/extensions doesn't even EXIST! Deleted /var/cache/apt/archives/firefox_18.0~a2~hg20121027r113701 as per recommendations. Errors were encountered while processing: /var/cache/apt/archives/firefox_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb E: Sub-process /usr/bin/dpkg returned an error code (1) Outputs: 1) sudo apt-get purge firefox firefox-globalmenu firefox-gnome-support me@machine ~ $ sudo apt-get purge firefox-gnome-support firefox firefox-globalmenu Reading package lists... Done Building dependency tree Reading state information... Done Package firefox is not installed, so not removed The following packages will be REMOVED: firefox-globalmenu* firefox-gnome-support* 2 not fully installed or removed. 0 upgraded, 0 newly installed, 2 to remove and 38 not upgraded. After this operation, 460 kB disk space will be freed. Do you want to continue [Y/n]? y (Reading database ... dpkg: warning: files list file for package `mysqltuner' missing, assuming package has no files currently installed. (Reading database ... 192642 files and directories currently installed.) Removing firefox-globalmenu ... Removing firefox-gnome-support ... 3) me@machine ~ $ sudo apt-get install firefox firefox-globalmenu firefox-gnome-support Reading package lists... Done Building dependency tree Reading state information... Done Suggested packages: latex-xft-fonts The following NEW packages will be installed: firefox firefox-globalmenu firefox-gnome-support 0 upgraded, 3 newly installed, 0 to remove and 38 not upgraded. Need to get 0 B/24.8 MB of archives. After this operation, 54.3 MB of additional disk space will be used. (Reading database ... dpkg: warning: files list file for package `mysqltuner' missing, assuming package has no files currently installed. (Reading database ... 192619 files and directories currently installed.) Unpacking firefox (from .../firefox_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb) ... dpkg: error processing /var/cache/apt/archives/firefox_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb (--unpack): trying to overwrite '/usr/lib/firefox/extensions', which is also in package mint-search-addon 2012.05.11 Selecting previously unselected package firefox-globalmenu. Unpacking firefox-globalmenu (from .../firefox-globalmenu_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb) ... Selecting previously unselected package firefox-gnome-support. Unpacking firefox-gnome-support (from .../firefox-gnome- support_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb) ... Processing triggers for man-db ... Processing triggers for desktop-file-utils ... Processing triggers for bamfdaemon ... 
Rebuilding /usr/share/applications/bamf.index... Processing triggers for gnome-menus ... Processing triggers for mintsystem ... Errors were encountered while processing: /var/cache/apt/archives/firefox_18.0~a2~hg20121027r113701- 0ubuntu1~umd1~precise_amd64.deb E: Sub-process /usr/bin/dpkg returned an error code (1) 4) sudo apt-get -f install 0 upgraded, 0 newly installed, 0 to remove, and 38 not upgraded Ideas? Tomcat6 only deploys my web application successfully in Firefox, not Chrome, so I'm really hoping to resolve this dependency issue.
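One plausible way out, given that the unpack conflict is with Mint's mint-search-addon package owning /usr/lib/firefox/extensions, is to remove that package first and then reinstall – a sketch based on the package names in the error output:

# remove the conflicting Mint package, then reinstall Firefox
sudo apt-get remove mint-search-addon
sudo apt-get install firefox firefox-globalmenu firefox-gnome-support
# alternatively, keep the addon and force the shared path to be overwritten:
# sudo dpkg -i --force-overwrite /var/cache/apt/archives/firefox_18.0~a2~hg20121027r113701-0ubuntu1~umd1~precise_amd64.deb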

    Read the article

  • OWB 11gR2 - Find and Search Metadata in Designer

    - by David Allan
Here are some tools and techniques for finding objects, specifically in the design repository. There are ways of navigating and collating objects that are useful for day to day development and build-time usage - this includes features out of the box and utilities constructed on top. There are a variety of techniques to navigate and find objects in the repository, the first 3 are out of the box, the 4th is an expert utility. Navigating by the tree, grouping by project and module - ok if you are aware of the exact module/folder that objects reside in. The structure panel is a useful way of finding parts of an object, especially when objects are large, rather than using the canvas. In large scale projects it helps to have accelerators (either find or collections below). Advanced find to search by name - 11gR2 included a find capability specifically for large scale projects. There were improvements in both the tree search and the object editors (including highlighting in mapping for example). So you can now do regular expression based search and quickly navigate to objects within a repository. Collections - logically organize your objects into virtual folders by shortcutting the actual objects. This is useful for a range of things since all the OWB services operate on collections too (export/import, validation, deployment). See the post here for new collection functionality in 11gR2. Reports for searching by type, updated on, updated by etc. Useful for activities such as periodic incremental actions (deploy all mappings changed in the past week). The report style view is useful since I can quickly see who changed what and when. You can see all the audit details for objects within each object's property inspector, but it's useful to just get all objects changed today, for example, or all objects changed since my last build etc. This utility combines both UI extensions via experts and the public views on the repository. In the figure to the right you see the contextual option 'Object Search' which invokes the utility, you can see I have quite a number of modules within my project. Figuring out all the potential objects which have been changed is not simple. The utility is an expert which provides this kind of search capability. The utility provides a report of the objects in the design repository which satisfy some filter criteria. The type of criteria includes: objects updated in the last n days; optionally filter the objects by the user who updated them; filter by project and by type (table/mappings etc.). The search dialog appears with these options, you can multi-select the object types, so for example you can select TABLE and MAPPING. It's also possible to search across projects if need be. If you have multiple users using the repository you can define the OWB user name in the 'Updated by' property to restrict the report to just that user also. Finally there is a search name that will be used for some of the options such as building a collection - this name is used for the collection to be built. In the example I have done, I've just searched my project for all process flows and mappings that users have updated in the last 7 days. The results of the query are returned in a table containing the object names, types, full path and audit details. The columns are sortable, you can sort the results by name, type, path etc. 
One of the cool things here is that you can then perform operations on these objects - such as edit them, export a single selection or the entire results to MDL, create a collection from the results (now you have a saved set of references in the repository, you could do deploy/export etc.), create a deployment script from the results...or even add in your own ideas! You see from this that you can do bulk operations on sets of objects based on search results. So for example selecting the 'Build Collection' option creates a collection with all of the objects from my search, and you can subsequently deploy/generate/maintain this collection of objects. Under the hood, the expert is just basic OMB commands from the product and the use of the public views on the design repository. You can see how easy it is to build up macro-like capabilities that will help you do day-to-day as well as build-like tasks on sets of objects.
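For a flavor of the public-view half of that, a query along these lines pulls recently changed mappings – note that the view and column names here are from memory of the OWB public-view naming scheme and should be checked against the OWB API and Scripting Reference for your release:

-- mappings updated in the last 7 days (view/column names are assumptions)
SELECT project_name, map_name, updated_by, updated_on
FROM   all_iv_xform_maps
WHERE  updated_on > SYSDATE - 7
ORDER  BY updated_on DESC;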

    Read the article

  • Useful SVN and Git commands – Cheatsheet

    - by Madhan ayyasamy
The following snippets will be helpful to anyone who uses version control systems like Git and SVN.

svn checkout/co checkout-url – used to pull an SVN tree from the server.
svn update/up – used to update the local copy with the changes made in the repository.
svn commit/ci -m "message" filename – used to commit the changes in a file to the repository with a message.
svn diff filename – shows the differences between your current file and what's there now in the repository.
svn revert filename – to overwrite the local file with the one in the repository.
svn add filename – for adding a file into the repository; you should commit your changes, and only then will it reflect in the repository.
svn delete filename – for deleting a file from the repository; you should commit your changes, and only then will it reflect in the repository.
svn move source destination – moves a file from one directory to another or renames a file. It will affect your local copy immediately, as well as the repository after committing.

git config – sets configuration values for your user name, email, file formats and more.
git init – initializes a git repository – creates the initial '.git' directory in a new or in an existing project.
git clone – makes a Git repository copy from a remote source. Also adds the original location as a remote so you can fetch from it again and push to it if you have permissions.
git add – adds file changes in your working directory to your index.
git rm – removes files from your index and your working directory so they will not be tracked.
git commit – takes all of the changes written in the index, creates a new commit object pointing to it and sets the branch to point to that new commit.
git status – shows you the status of files in the index versus the working directory.
git branch – lists existing branches, including remote branches if '-a' is provided. Creates a new branch if a branch name is provided.
git checkout – checks out a different branch – switches branches by updating the index, working tree, and HEAD to reflect the chosen branch.
git merge – merges one or more branches into your current branch and automatically creates a new commit if there are no conflicts.
git reset – resets your index and working directory to the state of your last commit.
git tag – tags a specific commit with a simple, human readable handle that never moves.
git pull – fetches the files from the remote repository and merges them with your local one.
git push – pushes all the modified local objects to the remote repository and advances its branches.
git remote – shows all the remote versions of your repository.
git log – shows a listing of commits on a branch including the corresponding details.
git show – shows information about a git object.
git diff – generates patch files or statistics of differences between paths or files in your git repository, or your index or your working directory.
gitk – graphical Tcl/Tk based interface to a local Git repository.
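Pulling a few of the git commands above into one typical session (the repository URL and branch name are just placeholders):

# clone, branch, commit and publish a change
git clone https://example.com/project.git
cd project
git checkout -b my-feature        # create and switch to a new branch
# ... edit files ...
git add .                         # stage the changes
git commit -m "Describe the change"
git push origin my-feature        # publish the branch to the remote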

    Read the article

  • Dual Screen will only mirror after 12.04 upgrade

    - by Ne0
I have been using Ubuntu with a dual screen for years now. After upgrading to 12.04 LTS I cannot get my dual screen working properly. Graphics: 01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI RV350 AR [Radeon 9600] 01:00.1 Display controller: Advanced Micro Devices [AMD] nee ATI RV350 AR [Radeon 9600] (Secondary) I noticed I was using open source drivers and attempted to install official binaries using the methods in this thread. Output: liam@liam-desktop:~$ sudo apt-get install fglrx fglrx-amdcccle Reading package lists... Done Building dependency tree Reading state information... Done The following packages will be upgraded: fglrx fglrx-amdcccle 2 upgraded, 0 newly installed, 0 to remove and 12 not upgraded. Need to get 45.1 MB of archives. After this operation, 739 kB of additional disk space will be used. Get:1 http://gb.archive.ubuntu.com/ubuntu/ precise/restricted fglrx i386 2:8.960-0ubuntu1 [39.2 MB] Get:2 http://gb.archive.ubuntu.com/ubuntu/ precise/restricted fglrx-amdcccle i386 2:8.960-0ubuntu1 [5,883 kB] Fetched 45.1 MB in 1min 33s (484 kB/s) (Reading database ... 328081 files and directories currently installed.) Preparing to replace fglrx 2:8.951-0ubuntu1 (using .../fglrx_2%3a8.960-0ubuntu1_i386.deb) ... Removing all DKMS Modules Error! There are no instances of module: fglrx 8.951 located in the DKMS tree. Done. Unpacking replacement fglrx ... Preparing to replace fglrx-amdcccle 2:8.951-0ubuntu1 (using .../fglrx-amdcccle_2%3a8.960-0ubuntu1_i386.deb) ... Unpacking replacement fglrx-amdcccle ... Processing triggers for ureadahead ... ureadahead will be reprofiled on next reboot Setting up fglrx (2:8.960-0ubuntu1) ... update-alternatives: warning: forcing reinstallation of alternative /usr/lib/fglrx/ld.so.conf because link group i386-linux-gnu_gl_conf is broken. update-alternatives: warning: skip creation of /etc/OpenCL/vendors/amdocl64.icd because associated file /usr/lib/fglrx/etc/OpenCL/vendors/amdocl64.icd (of link group i386-linux-gnu_gl_conf) doesn't exist. update-alternatives: warning: skip creation of /usr/lib32/libaticalcl.so because associated file /usr/lib32/fglrx/libaticalcl.so (of link group i386-linux-gnu_gl_conf) doesn't exist. update-alternatives: warning: skip creation of /usr/lib32/libaticalrt.so because associated file /usr/lib32/fglrx/libaticalrt.so (of link group i386-linux-gnu_gl_conf) doesn't exist. update-alternatives: warning: forcing reinstallation of alternative /usr/lib/fglrx/ld.so.conf because link group i386-linux-gnu_gl_conf is broken. update-alternatives: warning: skip creation of /etc/OpenCL/vendors/amdocl64.icd because associated file /usr/lib/fglrx/etc/OpenCL/vendors/amdocl64.icd (of link group i386-linux-gnu_gl_conf) doesn't exist. update-alternatives: warning: skip creation of /usr/lib32/libaticalcl.so because associated file /usr/lib32/fglrx/libaticalcl.so (of link group i386-linux-gnu_gl_conf) doesn't exist. update-alternatives: warning: skip creation of /usr/lib32/libaticalrt.so because associated file /usr/lib32/fglrx/libaticalrt.so (of link group i386-linux-gnu_gl_conf) doesn't exist. update-initramfs: deferring update (trigger activated) update-initramfs: Generating /boot/initrd.img-3.2.0-25-generic-pae Loading new fglrx-8.960 DKMS files... Building only for 3.2.0-25-generic-pae Building for architecture i686 Building initial module for 3.2.0-25-generic-pae Done. fglrx: Running module version sanity check. 
- Original module - No original module exists within this kernel - Installation - Installing to /lib/modules/3.2.0-25-generic-pae/updates/dkms/ depmod....... DKMS: install completed. update-initramfs: deferring update (trigger activated) Processing triggers for bamfdaemon ... Rebuilding /usr/share/applications/bamf.index... Processing triggers for initramfs-tools ... update-initramfs: Generating /boot/initrd.img-3.2.0-25-generic-pae Processing triggers for libc-bin ... ldconfig deferred processing now taking place liam@liam-desktop:~$ sudo aticonfig --initial -f aticonfig: No supported adapters detected When I attempt to get my settings back to what they were before upgrading I get this message: requested position/size for CRTC 81 is outside the allowed limit: position=(1440, 0), size=(1440, 900), maximum=(1680, 1680) and GDBus.Error:org.gtk.GDBus.UnmappedGError.Quark._gnome_2drr_2derror_2dquark.Code3: requested position/size for CRTC 81 is outside the allowed limit: position=(1440, 0), size=(1440, 900), maximum=(1680, 1680) Any ideas on what I need to do to fix this issue?
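For what it's worth, the RV350/Radeon 9600 is a legacy chip that recent fglrx releases no longer support, which would explain aticonfig reporting "No supported adapters detected". One possible route back to a working dual screen is to purge fglrx, fall back to the open-source radeon driver, and set the layout with xrandr – the output names below are assumptions, so run plain xrandr first to see yours:

# remove the proprietary driver and any fglrx-generated X config
sudo apt-get remove --purge fglrx fglrx-amdcccle
sudo rm -f /etc/X11/xorg.conf
# reboot, then extend the desktop across both monitors
xrandr --output DVI-0 --auto --output VGA-0 --auto --right-of DVI-0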

    Read the article

  • Silverlight Cream for March 08, 2011 -- #1056

    - by Dave Campbell
In this Issue: Joost van Schaik, Manas Patnaik, Kevin Hoffman, Jesse Liberty, Deborah Kurata, Dhananjay Kumar, Dennis Delimarsky, Samuel Jack, Peter Kuhn, WindowsPhoneGeek, and Jfo. Above the Fold: Silverlight: "How I let the trees grow" Peter Kuhn WP7: "Simple Windows Phone 7 / Silverlight drag/flick behavior" Joost van Schaik Shoutouts: SilverlightShow has their top 5 from last week posted, plus the ECOContest is ready to be voted on: SilverlightShow for Feb 28 - March 06, 2011 Drew DeVault is a young man involved with the Microsoft Student Insiders. He gave a WP7 presentation at RMTT and has posted his material: Post-Session: Windows Phone 7 @ RMTT Rui Marinho has an app in the ECO Contest called Forest Findr. It is based on the Bing Map Control for Silverlight and SQL Spatial data, helps you find forests and get geolocated pictures and Wikipedia information, and he has a post up with a bunch of info on it here: Forest Findr. my entry on the SilverlightShow EcoContest From SilverlightCream.com: Simple Windows Phone 7 / Silverlight drag/flick behavior Joost van Schaik has a behavior that makes *anything* draggable and 'flickable' in WP7 ... read the intro, scroll to the bottom to watch the demo, and then grab up the code... cool stuff, Joost! Data Aggregation Using Presentation Model in RIA and Silverlight 4 Manas Patnaik sent me a link to his blog, and it appears he's got lots of Silverlight goodness out there so you'll be hearing more about him. This first post is on the Presentation Model in RIA and Silverlight 4... good discussion, diagrams and code... good job, Manas! WP7 for iPhone and Android Developers - Advanced UI Kevin Hoffman has part 3 of an ambitious 12-part tutorial series up on WP7 development ... this go-around is concentrating on Advanced UI - Panorama/Pivot controls, DataBinding, ObservableCollections, and Converters... whew! Sterling DB on top of Isolated Storage – 2 Jesse Liberty has part 2 of his Sterling series up... this time setting up the database in App.xaml so it can be used for dealing with tombstoning. Silverlight Charting: Formatting the Tick Marks Deborah Kurata's next chart tutorial is all about showing you how to continue to dress up your charts... this time by formatting the tick marks... if you don't know what that is... check out the first image in the post. Stored Procedure in WCF Data Service Dhananjay Kumar has a very nice tutorial up on using a stored proc with WCF Data Services... I happen to know someone working on just that at this time. If you have this in mind, here's a step-by-step guide to getting it done. Windows Phone 7 – Episode 5 – Pages Dennis Delimarsky has part 5 of his WP7 tutorial series up and is discussing Pages in this 17 minute video. Unpacking Simon Squared: My mini framework-independent animation library Samuel Jack has not only Open-Sourced the WP7 game he built and blogged about, but he's now explaining some of the structure of the game in posts such as this one about the animation library he wrote that his game is built on. How I let the trees grow Peter Kuhn shares with us the code he used for the tree animation in his ECO Contest entry. There's a lot to learn in this post about performance ... the fully-animated tree has about 20K elements... 5K branches and 20K leaves... check it out. WP7 ToastPrompt in depth WindowsPhoneGeek takes a deep dive into the ToastPrompt control in the Coding4fun Toolkit... everything you need to completely use the control including sample code. 
Beware the loaded event Jfo talks about another frustration point she had with WP7 development, and that is around the use of the loaded event... read these tips from someone that's been there. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Java Hint in NetBeans for Identifying JOptionPanes

    - by Geertjan
    I tend to have "JOptionPane.showMessageDialogs" scattered through my code, for debugging purposes. Now I have a way to identify all of them, via the Refactoring window, and remove them one by one, since some of them are there for users of the application and so shouldn't be removed. Identifying instances of code that I'm interested in is really trivial:

        import org.netbeans.spi.editor.hints.ErrorDescription;
        import org.netbeans.spi.java.hints.ConstraintVariableType;
        import org.netbeans.spi.java.hints.ErrorDescriptionFactory;
        import org.netbeans.spi.java.hints.Hint;
        import org.netbeans.spi.java.hints.HintContext;
        import org.netbeans.spi.java.hints.TriggerPattern;
        import org.openide.util.NbBundle.Messages;

        @Hint(
            displayName = "#DN_ShowMessageDialogChecker",
            description = "#DESC_ShowMessageDialogChecker",
            category = "general")
        @Messages({
            "DN_ShowMessageDialogChecker=Found \"ShowMessageDialog\"",
            "DESC_ShowMessageDialogChecker=Checks for JOptionPane.showMes"
        })
        public class ShowMessageDialogChecker {

            @TriggerPattern(value = "$1.showMessageDialog",
                    constraints = @ConstraintVariableType(variable = "$1",
                            type = "javax.swing.JOptionPane"))
            @Messages("ERR_ShowMessageDialogChecker=Are you sure you need this statement?")
            public static ErrorDescription computeWarning(HintContext ctx) {
                return ErrorDescriptionFactory.forName(
                        ctx,
                        ctx.getPath(),
                        Bundle.ERR_ShowMessageDialogChecker());
            }
        }

    Stick the above class, which seriously isn't much code at all, in a module and run it, with this result:

    Bit trickier to do the fix, i.e., add a bit of code to let the user remove the statement, but I looked in the NetBeans sources and used the System.out fix, which does the same thing:

        import com.sun.source.tree.BlockTree;
        import com.sun.source.tree.StatementTree;
        import com.sun.source.util.TreePath;
        import java.util.ArrayList;
        import java.util.Iterator;
        import java.util.List;
        import org.netbeans.api.java.source.CompilationInfo;
        import org.netbeans.api.java.source.WorkingCopy;
        import org.netbeans.spi.editor.hints.ErrorDescription;
        import org.netbeans.spi.editor.hints.Fix;
        import org.netbeans.spi.java.hints.ConstraintVariableType;
        import org.netbeans.spi.java.hints.ErrorDescriptionFactory;
        import org.netbeans.spi.java.hints.Hint;
        import org.netbeans.spi.java.hints.HintContext;
        import org.netbeans.spi.java.hints.JavaFix;
        // Added so performRewrite compiles; TransformationContext is a nested class of JavaFix:
        import org.netbeans.spi.java.hints.JavaFix.TransformationContext;
        import org.netbeans.spi.java.hints.TriggerPattern;
        import org.openide.util.NbBundle.Messages;

        @Hint(
            displayName = "#DN_ShowMessageDialogChecker",
            description = "#DESC_ShowMessageDialogChecker",
            category = "general")
        @Messages({
            "DN_ShowMessageDialogChecker=Found \"ShowMessageDialog\"",
            "DESC_ShowMessageDialogChecker=Checks for JOptionPane.showMes"
        })
        public class ShowMessageDialogChecker {

            @TriggerPattern(value = "$1.showMessageDialog",
                    constraints = @ConstraintVariableType(variable = "$1",
                            type = "javax.swing.JOptionPane"))
            @Messages("ERR_ShowMessageDialogChecker=Are you sure you need this statement?")
            public static ErrorDescription computeWarning(HintContext ctx) {
                Fix fix = new FixImpl(ctx.getInfo(), ctx.getPath()).toEditorFix();
                return ErrorDescriptionFactory.forName(
                        ctx,
                        ctx.getPath(),
                        Bundle.ERR_ShowMessageDialogChecker(),
                        fix);
            }

            private static final class FixImpl extends JavaFix {

                public FixImpl(CompilationInfo info, TreePath tp) {
                    super(info, tp);
                }

                @Override
                @Messages("FIX_ShowMessageDialogChecker=Remove the statement")
                protected String getText() {
                    return Bundle.FIX_ShowMessageDialogChecker();
                }

                @Override
                protected void performRewrite(TransformationContext tc) throws Exception {
                    WorkingCopy wc = tc.getWorkingCopy();
                    TreePath statementPath = tc.getPath();
                    // Walk up from the matched expression to the enclosing block:
                    TreePath blockPath = tc.getPath().getParentPath();
                    while (!(blockPath.getLeaf() instanceof BlockTree)) {
                        statementPath = blockPath;
                        blockPath = blockPath.getParentPath();
                        if (blockPath == null) {
                            return;
                        }
                    }
                    BlockTree blockTree = (BlockTree) blockPath.getLeaf();
                    List<? extends StatementTree> statements = blockTree.getStatements();
                    // Copy every statement except the one being removed:
                    List<StatementTree> newStatements = new ArrayList<StatementTree>();
                    for (Iterator<? extends StatementTree> it = statements.iterator(); it.hasNext();) {
                        StatementTree statement = it.next();
                        if (statement != statementPath.getLeaf()) {
                            newStatements.add(statement);
                        }
                    }
                    BlockTree newBlockTree = wc.getTreeMaker().Block(newStatements, blockTree.isStatic());
                    wc.rewrite(blockTree, newBlockTree);
                }
            }
        }

    Aside from now being able to use "Inspect & Refactor" to identify and fix all instances of JOptionPane.showMessageDialog at the same time, you can also do the fixes per instance within the editor:
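    Not in the original post, but worth noting: if you want to verify the hint without launching a new NetBeans instance, the hints infrastructure ships a test harness, HintTest, in the java.hints.test module. Below is a minimal sketch, assuming that module (plus JUnit) is on the test classpath; the exact warning span string depends on your test source:

        import org.junit.Test;
        import org.netbeans.modules.java.hints.test.api.HintTest;

        public class ShowMessageDialogCheckerTest {

            @Test
            public void warnsOnShowMessageDialog() throws Exception {
                HintTest.create()
                        .input("package test;\n"
                                + "import javax.swing.JOptionPane;\n"
                                + "public class Test {\n"
                                + "    void m() {\n"
                                + "        JOptionPane.showMessageDialog(null, \"debug\");\n"
                                + "    }\n"
                                + "}\n")
                        .run(ShowMessageDialogChecker.class)
                        // "line:colStart-colEnd" must match the call site in the
                        // input above; adjust the span if you change the source:
                        .assertWarnings("4:20-4:37:verifier:Are you sure you need this statement?");
            }
        }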

    Read the article

  • How to "apt-get -f install" without deleting software?

    - by Jeggy
    I know Guitar Pro doesn't support 64-bit, but I did get it to work with this command:

        jeggy@jeggy-XPS:~$ sudo dpkg --force-architecture -i GuitarPro6-rev9063.deb
        [sudo] password for jeggy:
        Selecting previously unselected package guitarpro6:i386.
        (Reading database ... 285729 files and directories currently installed.)
        Unpacking guitarpro6:i386 (from GuitarPro6-rev9063.deb) ...
        dpkg: dependency problems prevent configuration of guitarpro6:i386:
         guitarpro6:i386 depends on gksu.
        dpkg: error processing guitarpro6:i386 (--install):
         dependency problems - leaving unconfigured
        Processing triggers for bamfdaemon ...
        Rebuilding /usr/share/applications/bamf.index...
        Processing triggers for desktop-file-utils ...
        Processing triggers for gnome-menus ...
        Errors were encountered while processing:
         guitarpro6:i386

    And even after that error the program works perfectly fine, and updating and adding PPAs to the system works great, but when I try to install some other software I get this error:

        jeggy@jeggy-XPS:~$ sudo apt-get install elinks
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        You might want to run 'apt-get -f install' to correct these:
        The following packages have unmet dependencies:
         elinks : Depends: libfsplib0 (>= 0.9) but it is not going to be installed
                  Depends: liblua50 (>= 5.0.3) but it is not going to be installed
                  Depends: liblualib50 (>= 5.0.3) but it is not going to be installed
                  Depends: libtre5 but it is not going to be installed
                  Depends: elinks-data (= 0.12~pre5-7ubuntu1) but it is not going to be installed
         guitarpro6:i386 : Depends: gksu:i386 but it is not going to be installed
        E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

    And whenever I run "apt-get -f install" I get this:

        jeggy@jeggy-XPS:~$ sudo apt-get -f install
        [sudo] password for jeggy:
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following packages were automatically installed and are no longer required:
          dconf-gsettings-backend:i386 python-levenshtein python-indicate libav-tools
          libstartup-notification0:i386 libxmuu1:i386 libavfilter-extra-2 libbabl-0.0-0
          libgegl-0.0-0 libgconf2-4:i386 python-vobject libgtk-3-0:i386 libpam-cap:i386
          python-utidylib libdconf0:i386 python-iniparse python-xmpp libpam-gnome-keyring:i386
          libxcb-util0:i386 python-farstream
        Use 'apt-get autoremove' to remove them.
        The following packages will be REMOVED:
          guitarpro6:i386
        0 upgraded, 0 newly installed, 1 to remove and 7 not upgraded.
        1 not fully installed or removed.
        After this operation, 84,0 MB disk space will be freed.
        Do you want to continue [Y/n]? y
        (Reading database ... 286979 files and directories currently installed.)
        Removing guitarpro6:i386 ...
        dpkg: warning: while removing guitarpro6:i386, directory '/opt/GuitarPro6/updater' not empty so not removed.
        dpkg: warning: while removing guitarpro6:i386, directory '/opt/GuitarPro6/Data/Soundbanks' not empty so not removed.
        Processing triggers for bamfdaemon ...
        Rebuilding /usr/share/applications/bamf.index...
        Processing triggers for desktop-file-utils ...
        Processing triggers for gnome-menus ...

    And now Guitar Pro is deleted. How can I install Guitar Pro and still be able to install other software afterwards?

    Read the article

  • Restrict number of characters to be typed for af:autoSuggestBehavior

    - by Arunkumar Ramamoorthy
    When using AutoSuggestBehavior for a UI component, the auto suggest list is displayed as soon as the user starts typing in the field. In this article, we will see how to hold back the autosuggest list until the user has typed a couple of characters. This is most useful on slow networks and when the autosuggest list is big. We can display a static message to let the user know that they need to type in more characters to get a list to pick a value from. The final output we would expect is like the below image.

    Let's see how we can implement this. Assume we have an input text for users to enter the country name, with an autosuggest behavior added to it:

        <af:inputText label="Country" id="it1">
            <af:autoSuggestBehavior/>
        </af:inputText>

    Also, assume we have a VO (we'll name it CountryView for this example) with a view criteria to filter the VO based on the bind variable passed in. Now, we would generate the View Impl class from the java node (including bind variables) and then expose the setter method of the bind variable to the client interface. In the View layer, we would create a tree binding for the VO and a method binding for the setter method of the bind variable exposed above, in the pagedef file.

    As we've already added an input text and an autosuggest behavior for the test, we don't need to build the suggested items for the autosuggest list by hand. Let us add a method in the backing bean to return a List of select items to be bound to the autosuggest list:

        public List onSuggest(String searchTerm) {
            ArrayList<SelectItem> selectItems = new ArrayList<SelectItem>();
            if (searchTerm.length() > 1) {
                // get access to the binding context and binding container at runtime
                BindingContext bctx = BindingContext.getCurrent();
                BindingContainer bindings = bctx.getCurrentBindingsEntry();
                // set the bind variable value that is used to filter the View Object
                // query of the suggest list. The View Object instance has a View
                // Criteria assigned
                OperationBinding setVariable = (OperationBinding) bindings.get("setBind_CountryName");
                setVariable.getParamsMap().put("value", searchTerm);
                setVariable.execute();
                // the data in the suggest list is queried by a tree binding
                JUCtrlHierBinding hierBinding = (JUCtrlHierBinding) bindings.get("CountryView1");
                // re-query the list based on the new bind variable values
                hierBinding.executeQuery();
                // the rangeSet, the list of queried entries, is of type JUCtrlValueBindingRef
                List<JUCtrlValueBindingRef> displayDataList = hierBinding.getRangeSet();
                for (JUCtrlValueBindingRef displayData : displayDataList) {
                    Row rw = displayData.getRow();
                    // populate the SelectItem list
                    selectItems.add(new SelectItem(
                            (String) rw.getAttribute("Name"),
                            (String) rw.getAttribute("Name")));
                }
            } else {
                SelectItem a = new SelectItem("", "Type in two or more characters..", "", true);
                selectItems.add(a);
            }
            return selectItems;
        }

    So, what we are doing in the above method is checking the length of the search term: if it is more than 1 (i.e. 2 or more characters), we return the actual suggest list. Otherwise, we create a static select item, new SelectItem("","Type in two or more characters..","",true), and add it to the list of suggested items to be displayed. The last (boolean) parameter of the SelectItem constructor disables the item, so users will not be able to select this static message from the displayed list.
    Finally, bind this method to the input text's autoSuggestBehavior suggestedItems property:

        <af:inputText label="Country" id="it1">
            <af:autoSuggestBehavior suggestedItems="#{AutoSuggestBean.onSuggest}"/>
        </af:inputText>
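    The article shows only the backing-bean method; as a sketch (not from the original post), here is a minimal skeleton of the class it could live in. The import locations below are the usual ADF runtime packages, and the bean name and registration are assumptions; verify both against your JDeveloper version:

        // Hypothetical host class for the onSuggest(String) method shown above.
        // Import locations assume the standard ADF runtime libraries.
        import java.util.ArrayList;
        import java.util.List;
        import javax.faces.model.SelectItem;
        import oracle.adf.model.BindingContext;
        import oracle.binding.BindingContainer;
        import oracle.binding.OperationBinding;
        import oracle.jbo.Row;
        import oracle.jbo.uicli.binding.JUCtrlHierBinding;
        import oracle.jbo.uicli.binding.JUCtrlValueBindingRef;

        public class AutoSuggestBean {
            // Paste the onSuggest(String) method from the article here, unchanged.
            // Register the class as a managed bean named "AutoSuggestBean"
            // (for example in adfc-config.xml) so that the EL expression
            // #{AutoSuggestBean.onSuggest} on the page resolves to it.
        }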

    Read the article

  • Ubuntu 12.04 LTS initramfs-tools dependency issue

    - by Mike
    I know this has been asked several times, but each issue and resolution seems different. I've tried almost everything I could think of, but I can't fix this. I have a VM (VMware, I think) running 12.04.03 LTS which has stuck dependencies. The VM is on a rented host, running a live system, so I don't want to break it (further).

        uname -a
        Linux support 3.5.0-36-generic #57~precise1-Ubuntu SMP Thu Jun 20 18:21:09 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux

    Some more:

        sudo apt-get update
        [sudo] password for tracker:
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        You might want to run ‘apt-get -f install’ to correct these.
        The following packages have unmet dependencies.
         initramfs-tools : Depends: initramfs-tools-bin (< 0.99ubuntu13.1.1~) but 0.99ubuntu13.3 is installed
        E: Unmet dependencies. Try using -f.

        sudo apt-get install -f
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following extra packages will be installed:
          initramfs-tools
        The following packages will be upgraded:
          initramfs-tools
        1 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
        2 not fully installed or removed.
        Need to get 0 B/50.3 kB of archives.
        After this operation, 0 B of additional disk space will be used.
        Do you want to continue [Y/n]? Y
        dpkg: dependency problems prevent configuration of initramfs-tools:
         initramfs-tools depends on initramfs-tools-bin (<< 0.99ubuntu13.1.1~); however:
          Version of initramfs-tools-bin on system is 0.99ubuntu13.3.
        dpkg: error processing initramfs-tools (--configure):
         dependency problems - leaving unconfigured
        No apport report written because the error message indicates it's a follow-up error from a previous failure.
        dpkg: dependency problems prevent configuration of apparmor:
         apparmor depends on initramfs-tools; however:
          Package initramfs-tools is not configured yet.
        dpkg: error processing apparmor (--configure):
         dependency problems - leaving unconfigured
        No apport report written because the error message indicates it's a follow-up error from a previous failure.
        Errors were encountered while processing:
         initramfs-tools
         apparmor
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    If I look at the policy behind initramfs-tools / initramfs-tools-bin I get:

        apt-cache policy initramfs-tools
        initramfs-tools:
          Installed: 0.99ubuntu13.1
          Candidate: 0.99ubuntu13.3
          Version table:
             0.99ubuntu13.3 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise-updates/main amd64 Packages
         *** 0.99ubuntu13.1 0
                100 /var/lib/dpkg/status
             0.99ubuntu13 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise/main amd64 Packages

        apt-cache policy initramfs-tools-bin
        initramfs-tools-bin:
          Installed: 0.99ubuntu13.3
          Candidate: 0.99ubuntu13.3
          Version table:
         *** 0.99ubuntu13.3 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise-updates/main amd64 Packages
                100 /var/lib/dpkg/status
             0.99ubuntu13 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise/main amd64 Packages

    So the issue seems to be that I have 0.99ubuntu13.3 for initramfs-tools-bin yet 0.99ubuntu13.1 for initramfs-tools, and can't upgrade to 0.99ubuntu13.3. I've performed apt-get clean/autoclean/install -f/upgrade -f many times, but they don't resolve it. I can think of only 2 other 'solutions':

    1. Edit the dpkg dependency list to trick it into doing the installation with a broken dependency. This seems very dodgy and would be a last resort.
    2. Downgrade both initramfs-tools and initramfs-tools-bin to 0.99ubuntu13 from the precise/main sources and hope that would get them in step. However I'm not sure if this will be possible, or whether it would introduce more issues.

    I'm not sure how this situation arose in the first place. /boot was 96% full; it's now 56% full (it's tiny - 64MB ... this is what I got from the hosting company). Can anyone offer advice please?

    Read the article

  • How to remove synaptic without installing all the unwanted packages?

    - by Jay
    I am trying to uninstall synaptic. I prefer using apt-get and other command-line tools to manage my packages, so I do not need synaptic or the software manager, and I'm trying to remove both of them using apt-get. It's a new box; I recently installed Linux Mint MATE 15. After installation, the only thing I did was:

        sudo apt-get update
        sudo apt-get dist-upgrade

    After that, I ran this command to remove synaptic:

        sudo apt-get remove --purge synaptic

    But this gives me a very weird output:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        The following packages were automatically installed and are no longer required:
          apturl-kde icoutils kate-data katepart kde-runtime kde-runtime-data kdelibs-bin kdelibs5-data
          kdelibs5-plugins kdesudo kdoctools kubuntu-debug-installer libattica0.4 libdlrestrictions1
          libkactivities-bin libkactivities-models1 libkactivities6 libkatepartinterfaces4 libkcmutils4
          libkde3support4 libkdeclarative5 libkdecore5 libkdesu5 libkdeui5 libkdewebkit5 libkdnssd4
          libkemoticons4 libkfile4 libkhtml5 libkidletime4 libkio5 libkjsapi4 libkjsembed4
          libkmediaplayer4 libknewstuff3-4 libknotifyconfig4 libkntlm4 libkparts4 libkpty4
          libkrosscore4 libktexteditor4 libkxmlrpcclient4 libnepomuk4 libnepomukcore4abi1
          libnepomukquery4a libnepomukutils4 libntrack-qt4-1 libntrack0 libphonon4 libplasma3
          libpolkit-qt-1-1 libpoppler-qt4-4 libqapt2 libqapt2-runtime libqca2 libqt4-qt3support
          libsolid4 libsoprano4 libstreamanalyzer0 libstreams0 libthreadweaver4 libvirtodbc0
          nepomuk-core nepomuk-core-data ntrack-module-libnl-0 odbcinst odbcinst1debian2
          oxygen-icon-theme phonon phonon-backend-gstreamer plasma-scriptengine-javascript qapt-batch
          shared-desktop-ontologies soprano-daemon virtuoso-minimal virtuoso-opensource-6.1-bin
          virtuoso-opensource-6.1-common
        Use 'apt-get autoremove' to remove them.
        The following extra packages will be installed:
          apturl-kde icoutils kate-data katepart kde-runtime kde-runtime-data kdelibs-bin kdelibs5-data
          kdelibs5-plugins kdesudo kdoctools kubuntu-debug-installer libattica0.4 libdlrestrictions1
          libkactivities-bin libkactivities-models1 libkactivities6 libkatepartinterfaces4 libkcmutils4
          libkde3support4 libkdeclarative5 libkdecore5 libkdesu5 libkdeui5 libkdewebkit5 libkdnssd4
          libkemoticons4 libkfile4 libkhtml5 libkidletime4 libkio5 libkjsapi4 libkjsembed4
          libkmediaplayer4 libknewstuff3-4 libknotifyconfig4 libkntlm4 libkparts4 libkpty4
          libkrosscore4 libktexteditor4 libkxmlrpcclient4 libnepomuk4 libnepomukcore4abi1
          libnepomukquery4a libnepomukutils4 libntrack-qt4-1 libntrack0 libphonon4 libplasma3
          libpolkit-qt-1-1 libpoppler-qt4-4 libqapt2 libqapt2-runtime libqca2 libqt4-qt3support
          libsolid4 libsoprano4 libstreamanalyzer0 libstreams0 libthreadweaver4 libvirtodbc0
          libxml2-utils nepomuk-core nepomuk-core-data ntrack-module-libnl-0 odbcinst odbcinst1debian2
          oxygen-icon-theme phonon phonon-backend-gstreamer plasma-scriptengine-javascript qapt-batch
          shared-desktop-ontologies soprano-daemon virtuoso-minimal virtuoso-opensource-6.1-bin
          virtuoso-opensource-6.1-common
        Suggested packages:
          libterm-readline-gnu-perl libterm-readline-perl-perl djvulibre-bin finger hspell
          libqca2-plugin-cyrus-sasl libqca2-plugin-gnupg libqca2-plugin-ossl phonon-backend-vlc
          phonon-backend-xine phonon-backend-mplayer
        The following packages will be REMOVED:
          aptoncd* apturl* mintupdate* mintwelcome* synaptic*
        The following NEW packages will be installed:
          apturl-kde icoutils kate-data katepart kde-runtime kde-runtime-data kdelibs-bin kdelibs5-data
          kdelibs5-plugins kdesudo kdoctools kubuntu-debug-installer libattica0.4 libdlrestrictions1
          libkactivities-bin libkactivities-models1 libkactivities6 libkatepartinterfaces4 libkcmutils4
          libkde3support4 libkdeclarative5 libkdecore5 libkdesu5 libkdeui5 libkdewebkit5 libkdnssd4
          libkemoticons4 libkfile4 libkhtml5 libkidletime4 libkio5 libkjsapi4 libkjsembed4
          libkmediaplayer4 libknewstuff3-4 libknotifyconfig4 libkntlm4 libkparts4 libkpty4
          libkrosscore4 libktexteditor4 libkxmlrpcclient4 libnepomuk4 libnepomukcore4abi1
          libnepomukquery4a libnepomukutils4 libntrack-qt4-1 libntrack0 libphonon4 libplasma3
          libpolkit-qt-1-1 libpoppler-qt4-4 libqapt2 libqapt2-runtime libqca2 libqt4-qt3support
          libsolid4 libsoprano4 libstreamanalyzer0 libstreams0 libthreadweaver4 libvirtodbc0
          libxml2-utils nepomuk-core nepomuk-core-data ntrack-module-libnl-0 odbcinst odbcinst1debian2
          oxygen-icon-theme phonon phonon-backend-gstreamer plasma-scriptengine-javascript qapt-batch
          shared-desktop-ontologies soprano-daemon virtuoso-minimal virtuoso-opensource-6.1-bin
          virtuoso-opensource-6.1-common
        0 upgraded, 78 newly installed, 5 to remove and 0 not upgraded.
        Need to get 60.9 MB of archives.
        After this operation, 146 MB of additional disk space will be used.
        Do you want to continue [Y/n]? n
        Abort.

    As you can see, apt-get is trying to install the same packages that it is asking me to autoremove. Could someone please tell me how to uninstall synaptic properly? Or am I missing something? Just for the record, I also did:

        sudo apt-get autoremove --purge

    like it asked me to ... and this is what I got:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        0 upgraded, 0 newly installed, 0 to remove and 6 not upgraded.

    Read the article
