Search Results

Search found 23568 results on 943 pages for 'select'.

Page 431/943 | < Previous Page | 427 428 429 430 431 432 433 434 435 436 437 438  | Next Page >

  • CSS Intelligent Merger

    - by BHare
    I am looking for a tool very similar to http://www.tothepc.com/archives/combine-merge-multiple-css-files/ However, given this example: test1.css: #admin { background: #c9d2dc; border-color: #ccc } test2.css: #admin { background: #222; border-bottom: 1px solid #444; border-left: 1px solid #444; padding: 2px; position: fixed; right: 0px; top: 0px; width: 120px; z-index: 2 } it will only allow you to select one rule or the other. I want to merge them, producing: #admin { background: #c9d2dc; border-color: #ccc; border-bottom: 1px solid #444; border-left: 1px solid #444; padding: 2px; position: fixed; right: 0px; top: 0px; width: 120px; z-index: 2 }
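    For what it's worth, the core of such a merger is small. A rough sketch in C# (an illustration only, not an existing tool; it ignores nested at-rules and media queries, and keeps the first value seen for a conflicting property, which matches the merged output above where background stays #c9d2dc):

        using System.Collections.Generic;
        using System.Linq;
        using System.Text.RegularExpressions;

        static class CssMerger
        {
            // Parse "selector { prop: value; ... }" blocks into a selector -> declarations map.
            static Dictionary<string, Dictionary<string, string>> Parse(string css)
            {
                var rules = new Dictionary<string, Dictionary<string, string>>();
                foreach (Match m in Regex.Matches(css, @"([^{}]+)\{([^}]*)\}"))
                {
                    string selector = m.Groups[1].Value.Trim();
                    if (!rules.ContainsKey(selector))
                        rules[selector] = new Dictionary<string, string>();
                    foreach (var decl in m.Groups[2].Value.Split(';'))
                    {
                        var parts = decl.Split(new[] { ':' }, 2);
                        if (parts.Length == 2)
                            rules[selector][parts[0].Trim()] = parts[1].Trim();
                    }
                }
                return rules;
            }

            // Merge stylesheets: identical selectors are combined into a single rule,
            // and the first file to define a property wins.
            public static string Merge(params string[] stylesheets)
            {
                var merged = new Dictionary<string, Dictionary<string, string>>();
                foreach (var sheet in stylesheets)
                    foreach (var rule in Parse(sheet))
                    {
                        if (!merged.ContainsKey(rule.Key))
                            merged[rule.Key] = new Dictionary<string, string>();
                        foreach (var decl in rule.Value)
                            if (!merged[rule.Key].ContainsKey(decl.Key))
                                merged[rule.Key][decl.Key] = decl.Value;
                    }
                return string.Join("\n", merged.Select(r =>
                    r.Key + " { " + string.Join("; ", r.Value.Select(d => d.Key + ": " + d.Value)) + " }"));
            }
        }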

    Read the article

  • In-page Google Analytics giving no page views recorded

    - by Nicolo77
    I am trying to use Google In-Page Analytics. The rest of Google Analytics seems to work correctly on my site, but when I go to the new In-Page Analytics, no clicks appear. I just get an error saying "There are no pageviews recorded for this page. Try adjusting the date range or select an alternate page." To the left, in the content details, it shows the number of page views. Do I need to set up something special for In-Page Analytics to work?

    Read the article

  • SQLAuthority News – Windows Azure Training Kit Updated October 2012

    - by pinaldave
    Microsoft has recently released an update to the Windows Azure Training Kit. Earlier this month they updated the kit and included quite a lot of new material. The training kit now contains 47 hands-on labs, 24 demos and 38 presentations. The best part is that the kit is now available to download in two different formats: 1) Full Package (324.5 MB) and 2) Web Installer (2.4 MB). The full package enables you to download all of the hands-on labs and presentations to your local machine. The Web Installer allows you to select and download just the specific hands-on labs and presentations that you need. The Windows Azure Training Kit contains hands-on labs, presentations, videos and demos. I encourage all of you to try this out as well. The kit also contains details about samples and tools. The training kit is the most authoritative learning resource on Windows Azure. You can download the Windows Azure Training Kit from here. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Azure, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Access Services in SharePoint Server 2010

    - by Wayne
    Another SharePoint Server 2010 feature which cannot go unnoticed is Access Services. Access Services is a service in SharePoint Server 2010 that allows administrators to view, edit, and configure a Microsoft Access application within a web browser. Access Services settings support backup and recovery, regardless of whether there is a UI setting in Central Administration. However, backup and recovery only apply to service-level and administrative-level settings; end-user content from the Access application is not backed up as part of this process. Access Services has Windows PowerShell functionality that can be used to provision the service using settings from a previous backup; configure and manage macro and query settings; manage and configure session management; and configure all the global settings of the service. Key benefits of SharePoint Server Access Services: Easier access to the right tools: The enhanced, customizable Ribbon in Access 2010 makes it easy to uncover more commands so you can focus on the end product. The new Microsoft Office Backstage™ view is yet another feature that can help you easily analyze and document your database, share, publish, and customize your Access 2010 experience, all from one convenient location. Helps build databases effortlessly and quickly: Out-of-the-box templates and reusable components make Access Services the fastest, simplest database solution available. It helps you find new pre-built templates which you can start using without customization, or select templates created by your peers in the Access online community and customize them to meet your needs. It builds your databases with new modular components. New Application Parts enable you to add a set of common Access components, such as a table and form for task management, to your database in a few simple clicks. Database navigation is now simplified. It creates Navigation Forms and makes your frequently used forms and reports more accessible without writing any code or logic. Create impactful forms and reports: Whether it's an inventory of your assets or a customer sales database, Access 2010 brings the innovative tools you'd expect from Microsoft Office. Access Services easily spots trends and adds emphasis to your data. It quickly creates coordinating database forms and reports and brings the Web into your database. Obtain a centralized landing pad for your data: Access 2010 offers easy ways to bring your data together and helps increase work quality. New technologies help break down barriers so you can share and work together on your databases, making you or your team more efficient and productive. Add automation and complex expressions: If you need a more robust database design, such as preventing record deletion if a specific condition is met, or if you need to create calculations to forecast your budget, Access 2010 empowers you to be your own developer. The enhanced Expression Builder greatly simplifies your expression building experience with IntelliSense®. With the revamped Macro Designer, it's now even easier for you to add basic logic to your database. New Data Macros allow you to attach logic to your data, centralizing the logic on the table, not the objects that update your data. Key features of Access Services 2010 - Access database content through a web browser: Newly added Access Services on Microsoft SharePoint Server 2010 enables you to make your databases available on the Web with new Web databases. Users without an Access client can open Web forms and reports via a browser, and changes are automatically synchronized. - Simplify how you access the features you need: The Ribbon, improved in Access 2010, helps you access commands even more quickly by enabling you to customize or create your own tabs. The new Microsoft Office Backstage view replaces the traditional File menu to provide one central, organized location for all of your document management tasks. - Codeless navigation: Use professional-looking, web-like navigation forms to make frequently used forms and reports more accessible without writing any code or logic. - Easily reuse Access items in other databases: Use Application Parts to add pre-built Access components for common tasks to your database in a few simple clicks. You can also package common database components, such as data entry forms and reports for task management, and reuse them across your organization or other databases. - Simplified formatting: By using Office themes you can create coordinating, professional forms and reports across your database. Simply select a familiar and great-looking Office theme, or design your own, and apply it to your database. Newly created Access objects will automatically match your chosen theme.

    Read the article

  • Only MKV format available in HandBrake

    - by Steve Ellis
    I am running Ubuntu 14.04 LTS 32-bit. I have installed HandBrake rev5474 (i686), which I believe is the latest, and the Ubuntu Restricted Extras. I am able to play DVDs via VLC, but when it comes to ripping them, so that I can back them up to my Twonky media server, I have issues. I launch HandBrake and find that the only format available for me to select is MKV. When I ran HandBrake on this machine under Ubuntu 13.10 and lower, I had no issues and lots of formats (including MP4, which is what I'm really after), but since reformatting and installing 14.04 I've had this issue. Any help would be much appreciated.

    Read the article

  • 2D isometric: screen to tile coordinates

    - by Dr_Asik
    I'm writing an isometric 2D game and I'm having difficulty figuring out precisely which tile the cursor is on. Here's a drawing: where xs and ys are screen coordinates (pixels), xt and yt are tile coordinates, W and H are tile width and tile height in pixels, respectively. My notation for coordinates is (y, x), which may be confusing, sorry about that. The best I could figure out so far is this: int xtemp = xs / (W / 2); int ytemp = ys / (H / 2); int xt = (xs - ys) / 2; int yt = ytemp + xt; This seems almost correct but gives me a very imprecise result, making it hard to select certain tiles, or sometimes it selects a tile next to the one I'm trying to click on. I don't understand why, and I'd like it if someone could help me understand the logic behind this. Thanks!
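    For reference, a minimal sketch of the usual inversion for a diamond-shaped isometric layout. It assumes tile (0, 0) has its top corner at screen (0, 0) and uses (x, y) ordering rather than the (y, x) notation above, so any camera/map offset must be subtracted from xs and ys first; the key point is to do the division in floating point and floor at the end, since integer division truncates and misplaces clicks near tile edges:

        // forward projection: xs = (xt - yt) * W / 2, ys = (xt + yt) * H / 2
        // solving those two equations for xt and yt gives the inverse below
        static void ScreenToTile(float xs, float ys, float W, float H, out int xt, out int yt)
        {
            float tx = (xs / (W / 2f) + ys / (H / 2f)) / 2f;
            float ty = (ys / (H / 2f) - xs / (W / 2f)) / 2f;
            xt = (int)Math.Floor(tx);   // Math.Floor, not a plain cast, so negative
            yt = (int)Math.Floor(ty);   // coordinates round toward minus infinity
        }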

    Read the article

  • Salesforce deployment guideline using Sandbox

    - by ybbest
    Create a deployment connection. Enable the inbound change set settings on the destination environment you would like to deploy the solution to. Enable the outbound change set settings on the source environment where you package your application. The best practice is to package everything in the change set; Salesforce will only deploy the changes into your destination environment. If you only package the changes, you could miss some of them. You can clone the change set in the source environment; however, the initial packaging takes some time, as you need to go through everything and select the components manually. After the change set is packaged, you need to upload it so that the destination environment can see it in its inbound change set list. Validate the change set before deployment. References: Development Lifecycle Guide, Change Sets Best Practices

    Read the article

  • English as a system language but Russian regional settings

    - by mbaitoff
    I usually choose English as an installation language since I believe that the original is better than the translation. However, the environment I'm working in is mostly Russian, so I have to deal with locale specifics. Even worse is the fact that selecting English yields the imperial measurement system, that is, feet, inches, and the damned Letter paper size. Whatever I do, I can't manage to get rid of the Letter paper size; eventually, here and there, I stumble upon Letter as a hidden default, and that spoils my prints. How can I select and use English as my language, but use the metric system and A4 paper size everywhere, along with Russian regional settings (date, time, decimals, etc.)?

    Read the article

  • Installation-Allocate drive space/Boot Loader

    - by user10134
    When I try to install Ubuntu 10.10 from the official live disc I got in the mail, I cannot get past the "Allocate Disk Space" step. I shrank my Win7 partition so I have unallocated space, and I also tried using the space while it is formatted as NTFS, but the partitions will not show up in the box. /dev/sda is selected under boot loader, and I can't select anything else, but the partition box is blank, so when I click "Install Ubuntu" it just says: "No root file system is defined. Please correct this from the partitioning menu." I am trying to dual-boot Win7 and Ubuntu, but I was never asked during the install process whether I would like to install just Ubuntu or dual-boot.

    Read the article

  • Intel HD graphics on Ubuntu

    - by tiax
    My girlfriend got a new subnotebook for Christmas (Sony Vaio VPCY21S1E), which comes with an "Intel HD Graphics" VGA adapter. When I try to boot the Ubuntu installer from USB, the screen goes blank after a short while, before I even see the Ubuntu logo. However, when I select "nomodeset" in the boot options, I can boot to the CLI login prompt. When I start X, though, it only works in VESA mode (I've read Intel eventually got rid of user-mode setting and only offers KMS now, which I've disabled to get it to boot). What can I do to enable a) a higher resolution than 1024x768 in VESA, and b) hardware acceleration for Compiz, video playback, etc.?

    Read the article

  • Broadcom STA driver doesn't work well with BCM4313

    - by Oli
    Following on from my other question about our new Samsung Q330, I've noticed that the wireless is incredibly flaky. It can connect, but after a little use, especially if it does a lot of downloading at once (read: install something from the Software Centre), the connection stops working. Network Manager still sees the connection; there's just no network throughput. I've tried simple tests like pinging other local network hosts and they just fail. The Samsung Q330 has a Broadcom BCM4313 wifi card (its proper ID: 14e4:4727) and it's running on the Broadcom STA driver that Jockey suggests (it didn't work at all without this). I did try installing b43-fwcutter but it just didn't do anything. I was expecting a configuration screen to come up (to select a firmware) but it never did. This page suggests the newer brcm80211 driver might be able to help, but I don't know how to install that. If you think this is the right route, please let me know how one goes about installing it.

    Read the article

  • Why does Ubuntu Download recommend 32-bit install?

    - by Warren Pena
    The Ubuntu desktop download screen has a pair of radio buttons you use to select whether you wish to download the 32-bit or 64-bit version. The 64-bit version is labeled "Not recommended for daily desktop usage." If you have a 64-bit processor, why would you not want to use the 64-bit version of Ubuntu? Update for 10.10: They've removed the "Not recommended" label from the 64-bit version and added a "Recommended" label to the 32-bit version. Update for 11.04: Same as 10.10. Update for 12.04: Still says "Recommended" next to the 32-bit version of the desktop download.

    Read the article

  • How to configure permanent autologin?

    - by DanielGibbs
    I'm trying to set up an Ubuntu 12.04 machine as a kiosk and I would like it, on boot, to automatically log in as my kiosk user "kiosk" and start the appropriate window manager, in this case blackbox. I have configured /etc/lightdm/lightdm.conf as in this question and have an appropriate /usr/share/xsessions/blackbox.desktop to launch blackbox. I managed to get initial autologin working by using the dbus-send method in this question; however, if I right-click and select "Exit" in blackbox, I am taken back to the login screen. How can I configure lightdm/Ubuntu to always autologin as "kiosk" instead of displaying the login screen? Or, failing that, how can I configure blackbox to not display a menu when I right-click?

    Read the article

  • Flattening a Jagged Array with LINQ

    - by PSteele
    Today I had to flatten a jagged array.  In my case, it was a string[][] and I needed to make sure every single string contained in that jagged array was set to something (non-null and non-empty).  LINQ made the flattening very easy.  In fact, I ended up making a generic version that I could use to flatten any type of jagged array (assuming it's a T[][]): private static IEnumerable<T> Flatten<T>(IEnumerable<T[]> data) { return from r in data from c in r select c; } Then, checking to make sure the data was valid, was easy: var flattened = Flatten(data); bool isValid = !flattened.Any(s => String.IsNullOrEmpty(s)); You could even use method grouping and reduce the validation to: bool isValid = !flattened.Any(String.IsNullOrEmpty); Technorati Tags: .NET,LINQ,Jagged Array
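    As a side note, the same flattening can be written with SelectMany, the LINQ operator that query syntax translates the nested "from ... from ..." into (an equivalent sketch of the method above, assuming the same usings, not taken from the original post):

        private static IEnumerable<T> Flatten<T>(IEnumerable<T[]> data)
        {
            // concatenates the inner arrays in order, exactly like the
            // query-syntax version above
            return data.SelectMany(r => r);
        }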

    Read the article

  • Desktop background won't change until restart

    - by Ben
    I'm new to Ubuntu and indeed Linux systems. I have 11.04 installed on my laptop. Here's the problem. When I select a picture for the desktop background, it says that the desktop background has been changed, but the changes do not apply right away. It is only after I have restarted the system that the changes appear. This did not happen before; when I first started using this OS a few months ago the changes applied immediately. So what have I done to make this start acting wonky? Thank you for any help.

    Read the article

  • Cardinality Estimation Bug with Lookups in SQL Server 2008 onward

    - by Paul White
    Cost-based optimization stands or falls on the quality of cardinality estimates (expected row counts).  If the optimizer has incorrect information to start with, it is quite unlikely to produce good quality execution plans except by chance.  There are many ways we can provide good starting information to the optimizer, and even more ways for cardinality estimation to go wrong.  Good database people know this, and work hard to write optimizer-friendly queries with a schema and metadata (e.g. statistics) that reduce the chances of poor cardinality estimation producing a sub-optimal plan.  Today, I am going to look at a case where poor cardinality estimation is Microsoft’s fault, and not yours. SQL Server 2005 SELECT th.ProductID, th.TransactionID, th.TransactionDate FROM Production.TransactionHistory AS th WHERE th.ProductID = 1 AND th.TransactionDate BETWEEN '20030901' AND '20031231'; The query plan on SQL Server 2005 is as follows (if you are using a more recent version of AdventureWorks, you will need to change the year on the date range from 2003 to 2007): There is an Index Seek on ProductID = 1, followed by a Key Lookup to find the Transaction Date for each row, and finally a Filter to restrict the results to only those rows where Transaction Date falls in the range specified.  The cardinality estimate of 45 rows at the Index Seek is exactly correct.  The table is not very large, there are up-to-date statistics associated with the index, so this is as expected. The estimate for the Key Lookup is also exactly right.  Each lookup into the Clustered Index to find the Transaction Date is guaranteed to return exactly one row.  The plan shows that the Key Lookup is expected to be executed 45 times.  The estimate for the Inner Join output is also correct – 45 rows from the seek joining to one row each time, gives 45 rows as output. The Filter estimate is also very good: the optimizer estimates 16.9951 rows will match the specified range of transaction dates.  Eleven rows are produced by this query, but that small difference is quite normal and certainly nothing to worry about here.  All good so far. SQL Server 2008 onward The same query executed against an identical copy of AdventureWorks on SQL Server 2008 produces a different execution plan: The optimizer has pushed the Filter conditions seen in the 2005 plan down to the Key Lookup.  This is a good optimization – it makes sense to filter rows out as early as possible.  Unfortunately, it has made a bit of a mess of the cardinality estimates. The post-Filter estimate of 16.9951 rows seen in the 2005 plan has moved with the predicate on Transaction Date.  Instead of estimating one row, the plan now suggests that 16.9951 rows will be produced by each clustered index lookup – clearly not right!  This misinformation also confuses SQL Sentry Plan Explorer: Plan Explorer shows 765 rows expected from the Key Lookup (it multiplies a rounded estimate of 17 rows by 45 expected executions to give 765 rows total). 
Workarounds One workaround is to provide a covering non-clustered index (avoiding the lookup avoids the problem of course): CREATE INDEX nc1 ON Production.TransactionHistory (ProductID) INCLUDE (TransactionDate); With the Transaction Date filter applied as a residual predicate in the same operator as the seek, the estimate is again as expected: We could also force the use of the ultimate covering index (the clustered one): SELECT th.ProductID, th.TransactionID, th.TransactionDate FROM Production.TransactionHistory AS th WITH (INDEX(1)) WHERE th.ProductID = 1 AND th.TransactionDate BETWEEN '20030901' AND '20031231'; Summary Providing a covering non-clustered index for all possible queries is not always practical, and scanning the clustered index will rarely be optimal.  Nevertheless, these are the best workarounds we have today. In the meantime, watch out for poor cardinality estimates when a predicate is applied as part of a lookup. The worst thing is that the estimate after the lookup join in the 2008+ plans is wrong.  It’s not hopelessly wrong in this particular case (45 versus 16.9951 is not the end of the world) but it easily can be much worse, and there’s not much you can do about it.  Any decisions made by the optimizer after such a lookup could be based on very wrong information – which can only be bad news. If you think this situation should be improved, please vote for this Connect item. © 2012 Paul White – All Rights Reserved twitter: @SQL_Kiwi email: [email protected]

    Read the article

  • Can I find out the number of searches on a given keyword, per state?

    - by Philippe
    I know that Google tells you how many times a certain keyword is used in a search. You can use the Google Keyword Tool for that. This tool also allows you to find out the number of "local" searches: this is the number of times a person from a given country searches for this keyword. My question: can you also find out how many searches originate from a given American state? In the Keyword Tool, I can only select countries, not states. Are there any other systems I can use to determine where people are searching for a given keyword?

    Read the article

  • How to import in BIDS more than one SSIS package in one shot!

    - by Luca Zavarella
    Have you ever wanted to add more than one existing Integration Services package (e.g. 20 packages) to an SSIS project? Well, you may suppose that an Open dialog supports multiple file selection to import more than one file at a time ... the BIDS Open dialog doesn’t allow this; you can only select a single file! Hence valuable time is lost importing the packages one at a time. A few days ago I learned a trick that solves the problem, thanks to this post by Matt Masson. Just copy all the packages to import from Windows Explorer (Ctrl + C): Then just right-click on the SSIS Packages folder of the Integration Services project and do a simple Paste (Ctrl + V): So “auto-magically” you’ll have all those packages imported into your Integration Services project!! What can I say... this feature was well hidden!

    Read the article

  • Detecting Duplicates Using Oracle Business Rules

    - by joeywong-Oracle
    Recently I was involved with a Business Process Management Proof of Concept (BPM PoC) where we wanted to show how customers could use Oracle Business Rules (OBR) to easily define some rules to detect certain conditions, such as duplicate account numbers, duplicate names, high transaction amounts, etc., in a set of transactions. Traditionally you would have to loop through the transactions and compare each transaction with each other to find matching conditions. This is not particularly nice, as it relies on more traditional approaches (coding) and is not the most efficient way. OBR is a great place to house these types of rules, as it allows users/developers to externalise the rules from the message flows in a simpler manner and allows users to change them when required. So I went ahead looking for some examples. After quite a bit of time spent Googling, I did not find much out in the blogosphere. In fact the best example was actually from...... wait for it...... Oracle Documentation! (http://docs.oracle.com/cd/E28271_01/user.1111/e10228/rules_start.htm#ASRUG228) However, if you followed the link, there was not much explanation provided with the example. So the aim of this article is to provide a little more explanation to the example so that it can be better understood. Note: I won’t be covering the BPM parts in great detail. Use case: A payment instruction file is required to be processed. Before the instruction file can be processed, it needs to be approved by a business user. Before the approval process, it would be useful to run the payment instruction file through OBR to look for transactions of interest. The output of the OBR can then be used to flag the transactions for the approvers to investigate. Example BPM Process So let’s start defining the Business Rules Dictionary. For the input into our rules, we will be passing in an array of payments which contain some basic information for our demo purposes. Input to Business Rules And for our output we want to have an array of rule output messages. Note that the element I am using for the output is only for one rule message element and not an array. We will configure the Business Rules component later to return an array instead. Output from Business Rules Business Rule – Create Dictionary Fill in all the details and click OK. Open the Business Rules component and select Decision Functions from the side. Modify the Decision Function Configuration Select the decision function and click on the edit button (the pencil); don’t worry that JDeveloper indicates that there is an error with the decision function. Then click the Outputs tab and make sure the checkbox under the List column is checked; this is to tell the Business Rules component that it should return an array of rule message elements. Updating the Decision Service Next we will define the actual rules. Click on Ruleset1 on the side and then the Create Rule in the IF/THEN Rule section. Creating new rule in ruleset Ok, this is where some detailed explanation is required. Remember that the input to this Business Rules dictionary is a list of payments, each of those payments being of the complex type PaymentType. Each of those payments in the Oracle Business Rules engine is treated as a fact in its working memory. Implemented rule So in the IF/THEN rule, the first task is to grab two PaymentType facts from the working memory and assign them to temporary variable names (payment1 and payment2 in our example).
    Matching facts Once we have them in the temporary variables, we can then start comparing them to each other. For our demonstration we want to find payments where the account numbers were the same but the account name was different. Suspicious payment instruction And to stop the rule from comparing the same facts to each other, over and over again, we have to include the last test. Stop rule from comparing endlessly And that’s it! No for loops, no need to keep track of what you have or have not compared; OBR handles all that for you because everything is done in its working memory. And once all the tests have been satisfied, we need to assert a new fact for the output. Assert the output fact Save your Business Rules. The next step is to complete the data association in the BPM process. Take extra care to use Copy List instead of the default Copy when doing data association at an array level. Input and output data association Deploy and test. Test data Rule matched Parting words: Ideally you would then use the output of the Business Rules component to display/flag the transactions which triggered the rule so that the approver can investigate. Link: SOA Project Archive [Download]
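    For comparison, the pairwise test the rule expresses could be written as a plain LINQ query over a hypothetical Payment class (AccountNumber, AccountName and TransactionId are made-up property names, used only to make the conditions concrete; the point of OBR is that you don't have to write and maintain this kind of code in the flow):

        var suspicious =
            from p1 in payments
            from p2 in payments
            where p1.AccountNumber == p2.AccountNumber   // same account number
               && p1.AccountName != p2.AccountName       // but a different account name
               && p1.TransactionId < p2.TransactionId    // don't compare a payment with itself
                                                         // or report the same pair twice
            select new { p1, p2 };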

    Read the article

  • HSSFS Part 3: SQL Saturday is Awesome! And DEFAULT_DOMAIN(), and how I found it

    - by Most Valuable Yak (Rob Volk)
    Just a quick post I should've done yesterday but I was recovering from SQL Saturday #48 in Columbia, SC, where I went to some really excellent sessions by some very smart experts.  If you have not yet attended a SQL Saturday, or it's been more than 1 month since you last did, SIGN UP NOW! While searching the OBJECT_DEFINITION() of SQL Server system procedures I stumbled across the DEFAULT_DOMAIN() function in xp_grantlogin and xp_revokelogin.  I couldn't find any information on it in Books Online, and it's a very simple, self-explanatory function, but it could be useful if you work in a multi-domain environment.  It's also the kind of neat thing you can find by using this query: SELECT OBJECT_SCHEMA_NAME([object_id]) object_schema, name FROM sys.all_objects WHERE OBJECT_DEFINITION([object_id]) LIKE '%()%'  ORDER BY 1,2 I'll post some elaborations and enhancements to this query in a later post, but it will get you started exploring the functional SQL Server sea. UPDATE: I goofed earlier and said SQL Saturday #46 was in Columbia. It's actually SQL Saturday #48, and SQL Saturday #46 was in Raleigh, NC.

    Read the article

  • Can I remove "Free space" text from the Nautilus status bar?

    - by DisgruntledGoat
    I want to remove the part in Nautilus that shows free disk space. In theory it may be useful, but it's so badly designed that every time I select some files my eye gets drawn to "15 GB" instead of the actual size of the files. And with no files selected, it says something like "24 items, Free space 15 GB", which at a quick glance looks like the total size of the files. I've looked through the preferences and don't see anything. I'm using Ubuntu 10.10 with Nautilus 2.32.0.

    Read the article

  • Is there something better than a StringBuilder for big blocks of SQL in the code

    - by Eduardo Molteni
    I'm just tired of making a big SQL statement, testing it, and then pasting the SQL into the code and adding all the sqlstmt.append(" at the beginning and the ") at the end. It's 2011, isn't there a better way to handle a big chunk of strings inside code? Please: don't suggest stored procedures or ORMs. edit Found the answer using XML literals and CData. Thanks to all the people that actually tried to answer the question without questioning me for not using ORMs or SPs and for using VB. edit 2 The question leaves me thinking that languages could make a better effort at supporting inline SQL with syntax coloring, etc. It would be cheaper than developing Linq2SQL. Just something like: dim sql = <sql> SELECT * ... </sql>

    Read the article

  • WCF Operations and Multidimensional Arrays

    - by JoshReuben
    You can't pass multidimensional arrays across the wire using WCF - you need to pass jagged arrays. Here are 2 extension methods (they belong in a static class and need System.Linq) that let you convert prior to serialization and convert back after deserialization:

        public static T[,] ToMultiD<T>(this T[][] jArray)
        {
            int i = jArray.Count();
            // j is the length of the longest inner array
            int j = jArray.Select(x => x.Count()).Aggregate(0, (current, c) => (current > c) ? current : c);

            var mArray = new T[i, j];
            for (int ii = 0; ii < i; ii++)
            {
                // copy only as many elements as this row actually has, so ragged
                // rows don't throw; the remaining cells stay default(T)
                for (int jj = 0; jj < jArray[ii].Length; jj++)
                {
                    mArray[ii, jj] = jArray[ii][jj];
                }
            }
            return mArray;
        }

        public static T[][] ToJagged<T>(this T[,] mArray)
        {
            var cols = mArray.GetLength(0);
            var rows = mArray.GetLength(1);
            var jArray = new T[cols][];
            for (int i = 0; i < cols; i++)
            {
                jArray[i] = new T[rows];
                for (int j = 0; j < rows; j++)
                {
                    jArray[i][j] = mArray[i, j];
                }
            }
            return jArray;
        }

    enjoy!
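    A quick usage sketch (the array contents are made up, just to show the round trip):

        // before sending: convert the multidimensional array to a jagged one
        double[,] grid = { { 1.0, 2.0 }, { 3.0, 4.0 } };
        double[][] wireFormat = grid.ToJagged();   // this is what goes into the data contract

        // after receiving: convert back to a multidimensional array
        double[,] restored = wireFormat.ToMultiD();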

    Read the article

  • Copy diagram with swimlanes in Enterprise Architect

    - by blomqvist
    This is a post mostly to myself, so that when I forget this I can come here and check it out :) (I forget about Ctrl-B from time to time) When copying diagrams from Enterprise Architect you can use Ctrl-B to include the swimlanes. A normal Ctrl-C will not give you the swimlanes. A side effect is that Ctrl-B gives you a copy of the whole diagram; you cannot select which objects to copy. So marking objects first or pressing Ctrl-A to select all is both unnecessary and meaningless, since Ctrl-B seems to mean: “copy the whole diagram now and include all the stuff in there, even the swimlanes” (which Ctrl-A, Ctrl-C does not).

    Read the article

  • How to reset settings when Unity won't finish booting?

    - by Emre
    I have a new 12.04 installation and I messed things up after trying to move /home to an NTFS partition, which I later learned was a bad idea. I removed the references to the NTFS partition in fstab and created new users on the ext4 / partition. Now I can't get Unity to start up properly for any user. I get the GUI with only three launcher icons (none of which are clickable) and no bar at the top. The keyboard seems to be nonfunctional after I enter my credentials. The interesting thing is that I can boot when I go through recovery mode and select resume. I wonder whether I am creating the new users properly. What is the correct protocol for doing so in order to ensure that they can run Unity?

    Read the article
