Search Results

Search found 22992 results on 920 pages for 'custom pages'.


  • Achieve Named Criteria with multiple tables in EJB Data control

    - by Deepak Siddappa
    In EJB, when you create a named criteria using a sparse XML, the named criteria wizard displays only the attributes of that particular entity, so results can be filtered on a single entity bean only. Consider a scenario where we need to create a named criteria based on multiple tables using EJB. In BC4J we can achieve this by creating a view object based on multiple tables, so in this article we will try to achieve a named criteria based on multiple tables using EJB.

    Implementation steps:

    1. Create a Java EE Web Application with entities based on Departments and Employees, then create a session bean and a data control for the session bean.

    2. Create a Java bean named CustomBean and add the code below to it. Three fields each are taken from the Departments and Employees tables.

        public class CustomBean {
            private BigDecimal departmentId;
            private String departmentName;
            private BigDecimal locationId;
            private BigDecimal employeeId;
            private String firstName;
            private String lastName;

            public CustomBean() {
                super();
            }

            public void setDepartmentId(BigDecimal departmentId) { this.departmentId = departmentId; }
            public BigDecimal getDepartmentId() { return departmentId; }
            public void setDepartmentName(String departmentName) { this.departmentName = departmentName; }
            public String getDepartmentName() { return departmentName; }
            public void setLocationId(BigDecimal locationId) { this.locationId = locationId; }
            public BigDecimal getLocationId() { return locationId; }
            public void setEmployeeId(BigDecimal employeeId) { this.employeeId = employeeId; }
            public BigDecimal getEmployeeId() { return employeeId; }
            public void setFirstName(String firstName) { this.firstName = firstName; }
            public String getFirstName() { return firstName; }
            public void setLastName(String lastName) { this.lastName = lastName; }
            public String getLastName() { return lastName; }
        }

    3. Open the session EJB file, add the method below to the session bean, expose the method in the local/remote interface, and generate a data control for it. Note: in the code below, "em" is an EntityManager.

        public List<CustomBean> getCustomBeanFindAll() {
            String queryString =
                "select d.department_id, d.department_name, d.location_id, e.employee_id, e.first_name, e.last_name from departments d, employees e\n" +
                "where e.department_id = d.department_id";
            Query genericSearchQuery = em.createNativeQuery(queryString, "CustomQuery");
            List resultList = genericSearchQuery.getResultList();
            Iterator resultListIterator = resultList.iterator();
            List<CustomBean> customList = new ArrayList();
            while (resultListIterator.hasNext()) {
                Object col[] = (Object[]) resultListIterator.next();
                CustomBean custom = new CustomBean();
                custom.setDepartmentId((BigDecimal) col[0]);
                custom.setDepartmentName((String) col[1]);
                custom.setLocationId((BigDecimal) col[2]);
                custom.setEmployeeId((BigDecimal) col[3]);
                custom.setFirstName((String) col[4]);
                custom.setLastName((String) col[5]);
                customList.add(custom);
            }
            return customList;
        }

    4. Open the DataControls.dcx file and create a sparse XML for CustomBean. In the sparse XML, navigate to the Named Criteria tab -> Bind Variables section and create two bind variables, deptId and fName. Then, under Named Criteria tab -> Named Criteria, create a named criteria and map the query attributes to the bind variables.

    5. In the ViewController, create a jspx page and, from the Data Controls palette, drop customBeanFindAll -> Named Criteria -> CustomBeanCriteria -> Query as an ADF Query Panel with Table.

    Run the jspx page and enter values in the search form with departmentId as 50 and firstName as "M". The named criteria will filter the query of the data source and display the result.

    Read the article

  • How do I configure custom routes when an interface is configured?

    - by ManicDee
    Other Superuser questions have addressed the issue of adding custom routes to access, for example, multiple networks of a corporate network through one interface while accessing the Internet through another interface. So, assuming that I have a script to add specific routes when en0 is configured, and a separate script to add specific routes when en1 is configured, is there some way I can trigger those scripts to run automatically when Mac OS X/Darwin starts and configures those interfaces? Back in my Linux days, it was possible to add an option in /etc/network/interfaces along the lines of:

        iface eth0 inet dhcp
            up /usr/local/sbin/eth0-routes-up

    Is there something similar for Mac OS X?
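
    One possible approach, sketched here as an assumption rather than something taken from the original thread: a launchd daemon that watches the system network configuration store and re-runs a route script whenever it changes. The label, script path, and watched directory below are illustrative placeholders.

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
          "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
        <plist version="1.0">
        <dict>
            <!-- Hypothetical label and script path -->
            <key>Label</key>
            <string>local.custom-routes</string>
            <key>ProgramArguments</key>
            <array>
                <string>/usr/local/sbin/update-routes.sh</string>
            </array>
            <!-- Re-run whenever the network configuration store changes -->
            <key>WatchPaths</key>
            <array>
                <string>/Library/Preferences/SystemConfiguration</string>
            </array>
            <key>RunAtLoad</key>
            <true/>
        </dict>
        </plist>

    The script itself could test whether en0 has an address (for example with ifconfig en0) and then add the required routes with route -n add. Save the plist in /Library/LaunchDaemons and load it with launchctl load.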

    Read the article

  • How to pass custom options to configure when building a package with debuild?

    - by TestUser16418
    Short background: I'm using Debian Sid. Currently the audacity package is conflicting with the pidgin package, because gstreamer0.10-plugins-bad are outdated. I'm trying to rebuild it, but one of the unit tests is failing as one plugin I don't need is causing a segfault. I need to disable these tests, and there's a configure option for that, but I don't know how to pass it. So, how can I run configure with custom options? Either by passing them to debuild, or by editing some file in the debian directory? I only worked with Gentoo ebuilds so far, which are extremely simple compared to the Debian control files, which I still find completely undecipherable.
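
    A sketch of the debhelper route, assuming the package's debian/rules uses the dh sequencer (if it uses cdbs or a hand-rolled rules file the hook differs), and using a placeholder flag name for whichever configure option actually disables the failing tests:

        #!/usr/bin/make -f
        # debian/rules  (recipe lines must start with a tab)

        %:
        	dh $@

        # Append extra arguments to the ./configure invocation
        override_dh_auto_configure:
        	dh_auto_configure -- --disable-failing-tests

    After editing debian/rules, rebuild with debuild -us -uc as usual. If only the test run needs to be skipped rather than configured away, an empty override_dh_auto_test target is another option.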

    Read the article

  • How to create custom Live CD with upgraded linux-image kernel?

    - by ???
    I'm following this tutorial to customize a Live CD: http://www.debuntu.org/how-to-customize-your-ubuntu-live-cd. However, it fails to upgrade linux-image; I guess it just can't run update-grub. Any idea?

    EDIT: I copied the custom/ directory into an ext4 image, and now the linux-image-2.6.35-24 package upgrades successfully. But as shown in the original CD-ROM, the vmlinuz-* and initrd.img-* files are moved out to /casper/vmlinuz and /casper/initrd.lz. Well, I can move vmlinuz-2.6.35-24-generic to /casper/vmlinuz, but how do I compress the file initrd.img-2.6.35-24-generic into .lz format? It looks like lzip, but it's not:

        # cd original-maverick/casper
        # lzip -d initrd.lz
        initrd.lz: bad magic number (file not in lzip format).
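
    For what it's worth, Ubuntu's casper initrd.lz is typically an LZMA-compressed cpio archive rather than lzip, despite the extension. A sketch of converting and inspecting it under that assumption (paths are the ones from the question):

        # initrd.img-* produced by the kernel packages is a gzip-compressed cpio;
        # convert it to the LZMA-compressed form casper expects:
        gunzip -c initrd.img-2.6.35-24-generic | lzma -7 > initrd.lz

        # To inspect the original instead:
        mkdir /tmp/initrd-tmp && cd /tmp/initrd-tmp
        lzma -dc /path/to/original-maverick/casper/initrd.lz | cpio -id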

    Read the article

  • How can I create a custom OpenOffice / LibreOffice Writer table AutoFormat scheme?

    - by Merlyn Morgan-Graham
    None of the basic table AutoFormat schemes in LibreOffice Writer have both an alternation style defined and no sum column/row style defined. If they have alternation, they always seem to have sums. Because of this I'd like to define my own table scheme. What is the easiest way to accomplish this? A WYSIWYG isn't totally necessary. I am not scared of editing simple XML files as long as I have examples to work from, and if I don't have to edit base install files. If I can place them in a custom area or my user profile directory then that would be best. If there is a way to get the GUI Add functionality to properly recognize an alternation then that would also be helpful.

    Read the article

  • How can I register a custom protocol with xdg?

    - by julien
    I've been struggling this morning trying to associate an application with a custom protocol, namely emacsclient and org-protocol. I'm calling this protocol from a web browser bookmarklet, and I get the following behaviour: in Chromium, the "Launch Application" dialog comes up and calls xdg-open org-protocol://... which ends up firing a new Chromium frame. In Firefox, I've tried setting network.protocol-handler.app.org-protocol to an empty string or my emacsclient path; either way I get the following error message: "Firefox doesn't know how to open this address, because the protocol (org-protocol) isn't associated with any program", without even showing any external application selection dialog. I'm not using any desktop environment, so I need to make this work strictly with xdg; however, despite reading the shared MIME info spec etc., I still can't fathom a working configuration.
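
    A sketch of the xdg-utils way to do this (the .desktop file name and Exec line are assumptions, and whether xdg-open honours scheme handlers outside a desktop environment depends on the xdg-utils version):

        # ~/.local/share/applications/org-protocol.desktop
        [Desktop Entry]
        Name=org-protocol handler
        Exec=emacsclient %u
        Type=Application
        Terminal=false
        MimeType=x-scheme-handler/org-protocol;

        # Register it as the default handler for the scheme
        update-desktop-database ~/.local/share/applications
        xdg-mime default org-protocol.desktop x-scheme-handler/org-protocol

    For Firefox, setting network.protocol-handler.expose.org-protocol to false is commonly reported to make it prompt for an external application instead of failing outright; treat that as a hint to verify rather than a confirmed fix.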

    Read the article

  • Where is the read/unread field in Outlook 2010 custom search?

    - by Ken
    I am trying to create a custom search folder in Outlook 2010. I am using the "Advanced" tab in the "Search Folder Criteria" dialog. One of the criteria I need is the read/unread status of a message, but the "Field" dropdown does not contain a field corresponding to read/unread status. This is odd because the read/unread status is available in the "More Choices" tab, but seemingly not in the "Advanced" tab. How do I create an advanced search folder criterion which incorporates the read/unread status of a message?

    Read the article

  • How do I create partitions with a custom layout in a VM on XenServer?

    - by Valter Silva
    We use XenServer with RAID 0, as one disk only (I wonder if this is a good approach; if not, what would be?), and then we install Xen. After that we put the CentOS 6.4 x64 DVD in the server and try to install it. But when the option to create a custom layout of the partitions should come up, it doesn't. So when we choose 'use entire disk', the OS starts installing and creates the default layout, with /home bigger than /, which is not what we want. I wonder if this is a problem with RAID, XenServer, XenClient, or CentOS?

    Read the article

  • Change cell formatting without VBA. Custom formatting?

    - by Sux2Lose
    I have a dropdown field with two values in a form I am creating. The expected data in cell D15 changes depending on the selection. If option A is selected then a dollar amount is expected; if option B is selected then a percentage is expected. I would like the cell to be formatted as 'accounting' with zero decimals for option A and 'percentage' with zero decimals for option B. I do not want to use VBA, if possible. I'm hoping there is a custom formatting solution.
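
    Not from the original question, but one possible VBA-free sketch: a conditional custom number format, relying on the assumption that option A dollar amounts are entered as values of 1 or more while option B percentages are entered as fractions below 1 (which is how Excel stores percentages):

        [<1]0%;$#,##0

    This only approximates the accounting look with a plain currency section; true accounting alignment, negative handling, and switching driven directly by the dropdown cell (rather than by the magnitude of the value) would need conditional formatting rules, which in Excel 2007 and later can also change a cell's number format.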

    Read the article

  • Custom Transport Agent: How do I collect NDRs and all other undeliverables in Exchange 2010 from the Postmaster?

    - by makerofthings7
    I'm trying to collect all NDRs in a single mailbox for all invalid recipients, and anything that fails for any reason. I have a custom transport agent that I've written myself, which appears here:

        [PS] C:\Windows\system32>Get-TransportAgent

        Identity                                Enabled    Priority
        --------                                -------    --------
        Transport Rule Agent                    True       1
        Text Messaging Routing Agent            True       2
        Text Messaging Delivery Agent           True       3
        Routing Rule Agent                      True       4

    Sometimes when I run Get-MessageTrackingLog I get failures like the one below:

        RunspaceId              : 4ecc61fb-13b9-4506-b680-577222c9bf21
        Timestamp               : 10/14/2013 12:42:42 PM
        ClientIp                :
        ClientHostname          : Exchange1
        ServerIp                :
        ServerHostname          :
        SourceContext           : Routing Rule Agent
        ConnectorId             :
        Source                  : AGENT
        EventId                 : FAIL
        InternalMessageId       : 4416
        MessageId               : <[email protected]>
        Recipients              : {[email protected]}
        RecipientStatus         : {}
        TotalBytes              : 4542
        RecipientCount          : 1
        RelatedRecipientAddress :
        Reference               :
        MessageSubject          : review CGRC due diligence.
        Sender                  : [email protected]
        ReturnPath              : [email protected]
        MessageInfo             :
        MessageLatency          :
        MessageLatencyType      : None
        EventData               :

    How can I collect the NDRs in a single mailbox for review? I have already run the following command, but it has had no effect:

        [PS] C:\>Set-TransportConfig -JournalingReportNdrTo [email protected] -ExternalPostmasterAddress [email protected]
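
    Not from the original post, but one built-in Exchange 2010 mechanism that may help: the transport service can copy DSNs with specific enhanced status codes to the Microsoft Exchange recipient, whose mail can in turn be delivered to a normal mailbox. A sketch, where the DSN codes and mailbox address are examples only:

        # Deliver mail addressed to the Microsoft Exchange recipient (including copied DSNs)
        # to a monitored mailbox
        Set-OrganizationConfig -MicrosoftExchangeRecipientReplyRecipient [email protected]

        # Copy DSNs with these enhanced status codes to that recipient
        # (the codes shown are examples, not a complete list)
        Set-TransportConfig -GenerateCopyOfDSNFor 5.1.1,5.4.4,5.7.1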

    Read the article

  • Custom CSV (.csv) filter for OpenOffice.org or LibreOffice?

    - by anon
    Is it possible to create some kind of 'custom CSV filter' for the OpenOffice.org or LibreOffice spreadsheet program? What I need is to have the program use predefined CSV settings for loading and saving when I open, let's say, a file named 'somefile.myext'. I would also need the loaded data to be placed in a prestyled spreadsheet. In this particular case, I would need the CSV settings to use tab as the field delimiter and no text delimiter at all. The prestyled spreadsheet would contain blue-gray coloring for every odd row (achieved with a conditional formatting formula), some font styling, and probably some column width definitions.

    Read the article

  • How to 'move' mail from Inbox to custom mail folder? (Mac Mail client + Gmail account)

    - by user27779
    I'm using Mail on Mac OS X with a Gmail account. I can make a new mailbox folder, and mails can be dragged into it. But the mails still remain in the Inbox; they are not moved, just copied (well, it's the 'label' feature of Gmail, I know). This drives me crazy. I can't separate mails by purpose. Is there any way to make the Mail application 'move' mails into my custom mailbox (removing them from the Inbox)?

    Read the article

  • How do I use a custom 503 error page with Nginx?

    - by Michael Gorsuch
    Hi. I have implemented rate limiting with Nginx (which works excellently, by the way) and would like to display a custom 503 error page. I have followed examples on the web without luck. I am running a simple configuration that looks something like this:

        listen x.x.x.x:80;
        server_name something.com;
        root /usr/local/www/something.com;

        error_page 503 /503.html;

        location / {
            limit_req zone=default burst=5 nodelay;
            proxy_pass http://mybackend;
        }

    The idea is that our rate-limited users would be shown a special page explaining what is going on. The rate limiting is working, but the built-in 503 page is rendering. Any ideas?
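
    A sketch of one likely fix, not taken from the original post: the internal redirect to /503.html itself matches "location /" and gets proxied, so the custom page is never served from disk. Defining an exact-match location for the error page usually addresses this; the root path below is the one from the question.

        error_page 503 /503.html;

        location = /503.html {
            root /usr/local/www/something.com;
            internal;
        }

        location / {
            limit_req zone=default burst=5 nodelay;
            proxy_pass http://mybackend;
        }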

    Read the article

  • Compression for custom mime type works on IIS 7.5, but not IIS 7.0?

    - by Doogal
    I've managed to set up compression for a custom MIME type on IIS 7.5 with no problem: add the MIME type to IIS, then add it to the httpCompression element in applicationHost.config. But when I do the same thing on IIS 7.0, that particular MIME type is never compressed. This isn't a problem with compression in general, since other MIME types are compressed correctly. As far as I can tell, IIS 7.0 and IIS 7.5 are configured in exactly the same way. Does IIS 7.0 behave differently, and do I need to do something else to get it working? I've set up failed request tracing and get a NO_MATCHING_CONTENT_TYPE error during compression, but I can't figure out what else I need to do to tell IIS about my MIME type.
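
    One difference worth ruling out (my own guess, not something confirmed in the question): IIS only compresses a URL once it is requested "frequently", governed by the frequentHitThreshold attribute of system.webServer/serverRuntime, and a mismatch in that setting between the two servers can make one box appear never to compress. A sketch of lowering it so the first request is already eligible:

        %windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/serverRuntime /frequentHitThreshold:"1" /commit:apphost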

    Read the article

  • WordPress SEO Plugins to make your Blog Search Engine Friendly

    - by Vaibhav
    WordPress is the most common blogging system in use today, and its use as a CMS is also widespread. With hundreds of millions of sites using WordPress, getting SEO right for your WordPress-based blog or site is very important. We get regular queries from people who want search engine optimisation for their site or blog made with WordPress. Here is a list of 16 of the best WordPress plug-ins that can help you achieve better rankings:

    1. All in One SEO Pack - The most popular of all SEO plugins for WordPress. It is easy to use and is compatible with most WordPress plugins. It works as a complete SEO package, automatically generating META tags and optimizing your titles for search engines while avoiding duplicate content. You can also set META tags manually (meta title, meta description and meta keywords) for all pages and posts on your website.

    2. HeadSpace2 - Available in different languages, it lets you manage a wide range of SEO tasks related to meta data; you can tag your posts and set custom descriptions and titles, so your pages rank for the relevancy you created, and you can load different settings for different pages.

    3. Platinum SEO Plugin - Automatic 301 redirects on permalink changes, META tag generation, duplicate-content avoidance, SEO optimization of post and page titles, and a lot of other features.

    4. TGFI.net SEO WordPress Plugin - A modified version of the All in One SEO Pack. Its unique feature over All in One SEO is that it generates titles, meta descriptions and meta keywords automatically when overrides are not present.

    5. Google XML Sitemaps - Sitemaps generated by this tool are supported by Google, Yahoo, Bing, and Ask. Sitemaps make indexing of web pages easier for web crawlers, which can retrieve the complete structure of the site and more information from them. The plugin notifies all major search engines every time you create a new post.

    6. Sitemap Generator - Generates a highly customizable sitemap for your WordPress site. You can choose what to show and what not to show, and list the items in the order of your choice. It supports pages, permalinks and multi-level categories.

    7. SEO Slugs - Generates more search-engine-friendly URLs for your site. Slugs are the filenames assigned to your posts; this plugin removes common words like 'a', 'the', 'in', 'what', 'you' from the slug that is assigned automatically to your post.

    8. SEO Post Links - Similar to SEO Slugs; it removes unnecessary keywords from the slug to make it short and SEO friendly, and you can fix the number of characters used from your post title.

    9. Automatic SEO Links - Creates automatic linking in your posts, usable for internal as well as external linking. Just select your words, anchor text, target URL and nature of the links (dofollow/nofollow), and the plugin will replace the matches found in your posts.

    10. WP Backlinks - A helpful plugin for link exchange: whenever a webmaster submits a link for exchange, the plugin will spider the webmaster's site for a reciprocal link, and if everything is found to be good, your link will be exchanged.

    11. SEO Title Tag - Lets you optimize the title tags of your WordPress blog. Overriding the title tag with custom titles, mass editing, and title tags for 404 pages are the main features of this plugin.

    12. 404 SEO Plugin - Lets you customize the 404 page of your site; you can give a customized error message and links to relevant pages of your site.

    13. Redirection - A powerful plugin to manage 301 redirections and the logs related to them. You can track 404 errors and keep a log of all redirected URLs, and the plugin can redirect a post automatically when the URL of that post changes.

    14. AddToAny - Helps your readers share, save, email and bookmark your posts and pages. It supports more than a hundred social bookmarking, networking and sharing sites.

    15. SEO Friendly Images - Makes the images on your site SEO friendly by updating them with proper titles and ALT tags.

    16. Robots Meta - Prevents search engines from indexing comments on your posts and your login and admin pages. It also allows you to add tags for individual pages.

    Read the article

  • Integrating Windows Form Click Once Application into SharePoint 2007 – Part 2 of 4

    - by Kelly Jones
    In my last post, I explained why we decided to use a ClickOnce application to solve our business problem. To quickly review, we needed a way for our business users to upload documents to a SharePoint 2007 document library en masse, set the metadata, set the permissions per document, and to do so easily.

    Let's look at the pieces that make up our solution. First, we have the Windows Forms application. This app is deployed using ClickOnce and calls SharePoint web services in order to upload files, and then calls web services to set the metadata (SharePoint columns and permissions). Second, we have a custom action. The custom action is responsible for providing our users a link that will launch the Windows app, as well as passing values to it via the query string. And lastly, we have the web services that the Windows Forms application calls. For our solution, we used both out-of-the-box web services and a custom web service in order to set the column values in the document library as well as the permissions on the documents.

    Now, let's look at the technical details of each of these pieces. (All of the code is downloadable from here.)

    Windows Forms application deployed via ClickOnce

    The Windows Forms application, called "Custom Upload", has just a few classes in it:

    - Custom Upload -- the form
    - FileList.xsd -- the dataset used to track the names of the files and their metadata values
    - SharePointUpload -- this class handles uploading the file

    SharePointUpload uses an HttpWebRequest to transfer the file to the web server. We had to change this code from a WebClient object to the HttpWebRequest object, because we needed to be able to set the timeout value.

        public bool UploadDocument(string localFilename, string remoteFilename)
        {
            bool result = true;
            // Need to use an HttpWebRequest object instead of a WebClient object
            // so we can set the timeout (WebClient doesn't allow you to set the timeout!)
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(remoteFilename);
            try
            {
                req.Method = "PUT";
                req.Timeout = 60 * 1000; // convert seconds to milliseconds
                req.AllowWriteStreamBuffering = true;
                req.Credentials = System.Net.CredentialCache.DefaultCredentials;
                req.SendChunked = false;
                req.KeepAlive = true;

                Stream reqStream = req.GetRequestStream();
                FileStream rdr = new FileStream(localFilename, FileMode.Open, FileAccess.Read);
                byte[] inData = new byte[4096];
                int bytesRead = rdr.Read(inData, 0, inData.Length);
                while (bytesRead > 0)
                {
                    reqStream.Write(inData, 0, bytesRead);
                    bytesRead = rdr.Read(inData, 0, inData.Length);
                }
                reqStream.Close();
                rdr.Close();

                System.Net.HttpWebResponse response = (HttpWebResponse)req.GetResponse();
                if (response.StatusCode != HttpStatusCode.OK &&
                    response.StatusCode != HttpStatusCode.Created)
                {
                    String msg = String.Format(
                        "An error occurred while uploading this file: {0}\n\nError response code: {1}",
                        System.IO.Path.GetFileName(localFilename),
                        response.StatusCode.ToString());
                    LogWarning(msg, "2ACFFCCA-59BA-40c8-A9AB-05FA3331D223");
                    result = false;
                }
            }
            catch (Exception ex)
            {
                LogException(ex, "{E9D62A93-D298-470d-A6BA-19AAB237978A}");
                result = false;
            }
            return result;
        }

    The class also contains the LogException() and LogWarning() methods.

    When the application is launched, it parses the query string for some initial values. The query string looks like this:

        string queryString = "Srv=clickonce&Sec=N&Doc=DMI&SiteName=&Speed=128000&Max=50";

    Srv is the path to the server (my virtual machine is named "clickonce"), Sec is short for security, meaning HTTPS or HTTP, Doc is the shortcut for which document library to use, and SiteName is the name of the SharePoint site. Speed is used to calculate an estimate of the download speed for each file. We added this so our users uploading documents would realize how long it might take for clients in remote locations (using slow WAN connections) to download the documents. The last value, Max, is the maximum size that the SharePoint site will allow documents to be. This allowed us to warn users that a file is too large before we even attempt to upload it.

    Another critical piece is the metadata collection. We organized our site using SharePoint content types, so when the app loads, it gets a list of the document library's content types. The user then selects one of the content types from the drop-down list, and then we query SharePoint to get a list of the fields that make up that content type. We used both an out-of-the-box web service and one that we custom built in order to get these values.

    Once we have the content type fields, we then add controls to the form. Which type of control we add depends on the data type of the field (DateTime pickers for date/time fields, etc.). We didn't write code to cover every data type, since we were working with a limited set of content types and field data types.

    Here's a screenshot of the form, before and after someone has selected the content type and our code has added the custom controls.

    The other piece of metadata we collect is in the upper right corner of the app, "Users with access". This box lists the different SharePoint groups that we have set up, and by checking the boxes the user can set the permissions on the uploaded documents.

    All of this metadata is collected and submitted to our custom web service, which then sets the values on the documents in the list. We'll look at these web services in a future post. In the next post, we'll walk through the custom action we built.

    Read the article

  • Reading the tea leaves from Windows Azure support

    - by jamiet
    A few idle thoughts… Three months ago I had an issue regarding Windows Azure where I was unable to log in to the management portal. At the time I contacted Azure support, the issue was soon resolved, and I thought no more about it. Until today, that is, when I received an email from Azure support providing a detailed analysis of the root cause, the fix, and moreover precise details about when and where things occurred. The email itself is interesting and I have included the entirety of it below. A few things were interesting to me:

    The level of detail and the diligence in investigating and reporting the issue I found really rather impressive. They even outline the number of users that were affected (127, in case you can't be bothered reading). Compare this to the quite pathetic support that another division within Microsoft, Skype, provided to Greg Low recently: Skype support and dead parrot sketches.

    This line I also found particularly interesting: "Windows Azure performed a planned change from using the Microsoft account service (formerly Windows Live ID) to the Azure Active Directory (AAD) as its primary authentication mechanism on August 24th. This change was made to enable future innovation in the area of authentication – particularly for organizationally owned identities, identity federation, stronger authentication methods and compliance certification."

    I have long thought that one of the reasons Microsoft has proved to be such a money-making machine in the enterprise is because they provide the infrastructure and then upsell on top of that, and nothing is more infrastructural than Active Directory. It has struck me of late that they are trying to make the same play in the cloud by tying all their services into Azure Active Directory, and here we see a clear indication of that by making AAD the authentication mechanism for anyone using Windows Azure. I get the feeling that we're going to hear much, much more about AAD in the future; isn't it about time we could log on to Windows Azure SQL Database without resorting to SQL authentication, for example? And why do Microsoft have two identity providers, Microsoft Account (aka Windows Live ID) and AAD; isn't it about time those things were combined?

    As I said, just some idle thoughts. Below is the transcript of the email if you are interested.

    @Jamiet, this is regarding the support request <redacted> wherein you were not able to log in to the Windows Azure management portal with your Live ID. We are providing you with the summary, root cause analysis and information about the permanent fix.

    Incident Title: You were unable to access the Windows Azure Portal after the Microsoft Account to Azure Active Directory account migration.
    Service Impacted: Management Portal
    Incident Start Date and Time: 8/24/2012 4:30:00 PM
    Date and Time Service was Restored: 10/17/2012 12:00:00 AM

    Summary: Windows Azure performed a planned change from using the Microsoft account service (formerly Windows Live ID) to the Azure Active Directory (AAD) as its primary authentication mechanism on August 24th. This change was made to enable future innovation in the area of authentication – particularly for organizationally owned identities, identity federation, stronger authentication methods and compliance certification. While this migration was largely transparent to Windows Azure users, a small number of users whose sign-in names were part of a Windows Live Custom Domain were unable to log in. This incompatibility was not discovered during the Quality Assurance testing phase prior to the migration.

    Customer Impact: Customers whose sign-in names were part of a Windows Live Custom Domain were unable to sign in to the Management Portal after ~4:00 p.m. PST on August 24th, 2012. We determined that the issue did impact at least 127 users in 98 of these Windows Live Custom Domains and had a maximum potential impact of 1,110 users in total.

    Root Cause: The root cause of the issue was an incompatibility in the AAD authentication service in handling logins from Microsoft accounts whose sign-in names were part of a Windows Live Custom Domain. This issue was not discovered during the Quality Assurance testing phase prior to the migration from Microsoft Account (MSA) to AAD.

    Mitigations: The issue was mitigated for the majority of affected users by 8:20 a.m. PST on August 25th, 2012 by running internal scripts to correct many known Windows Live Custom Domains. The remaining affected domains fell into two categories:

    1. Windows Live Custom Domains that were not corrected by 8/25/2012. An additional 48 Windows Live Custom Domains were fixed in the weeks following the incident, within 2 business days after the AAD team received an escalation from product support regarding those accounts.

    2. Windows Live Custom Domains that were also provisioned in Office 365. Some of the affected Windows Live Custom Domains had already been provisioned in AAD because their owners signed up for Office 365, a service that also uses AAD. In these cases the Azure customers had to work around the issue by renaming their Microsoft Account or using a different Microsoft Account to administer their Azure subscription.

    Permanent Fix: The Azure Active Directory team permanently fixed the issue for all customers on 10/17/2012 in an upgraded release of the AAD service.

    Read the article

  • How to "drag and drop" folders or multiple HTML files into a browser and have them open in multiple tabs

    - by PoorLuzer
    I save pages that I browse on the net and find interesting into a folder called C:\PageSaves. Later, during the commute, I open these pages to see what they are and move them into a neatly categorized folder tree. For example, Perl-related pages go to C:\Pages\Perl, MySQL-related pages go to C:\Pages\MySQL, and so on. I was wondering if there is any way I could open any number of HTML files on disc / inside a folder (C:\PageSaves in my case) in Mozilla/FF/K-Meleon etc. For example, I would like to just "drag and drop" the folder C:\PageSaves into Firefox and have it open all the .html pages in the folder in separate tabs. Right now, if I "drag and drop" multiple HTML files, it just opens the last file in the selection. I would also like a set of toolbar buttons, basically a plugin, that would allow me to nuke the page from disc (if I don't want to keep the page anymore) or move the file (and its corresponding folder) into a predefined / new folder. I am familiar with coding full-blown Firefox plugins, so even if something very basic / almost similar exists, I can take it forward. Hints/clues/other methods of achieving the same result are all welcome!
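
    Not part of the original question, but a minimal Windows batch sketch for the "open everything in C:\PageSaves" half, assuming Firefox is on the PATH and configured (browser.link.open_newwindow) to open externally launched pages as tabs in the existing window:

        @echo off
        rem Open every saved page in C:\PageSaves as a tab in the running browser
        for %%f in ("C:\PageSaves\*.html") do (
            start "" firefox "%%f"
        )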

    Read the article

  • Site Security/Access management for asp.net mvc application

    - by minal
    I am trying to find a good pattern to use for user access validation. Basically, on a WebForms application I had a framework which used user roles to define access: users were assigned to roles, and roles were granted access to pages. I had a table in the database with all the pages listed in it. Pages could have child pages that inherited their access from the parent. When defining access, I assigned the roles access to the pages; users in a role then had access to those pages. It is fairly simple to manage as well. The way I implemented this was in a base class that every page inherited: on page load/init I would check the page URL, validate access, and act appropriately. However, I am now working on an MVC application and need to implement something similar, but I can't find a good way to make my previous solution work, purely because I don't have static pages as URL paths. Also, I am not sure how best to approach this as I now have controllers rather than aspx pages. I have looked at the MvcSiteMapProvider, but that does not work off a database; it needs a sitemap file. I need to be able to change user permissions on the fly. Any thoughts/suggestions/pointers would be greatly appreciated.
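
    A minimal sketch of one way to adapt the old base-page check to MVC (my own illustration, not from the post): a custom authorization attribute that looks up the controller and action being executed against the database-driven permission store. The UserHasAccess lookup is a hypothetical placeholder for the existing pages/roles tables.

        using System.Web;
        using System.Web.Mvc;

        // Hypothetical [DbAuthorize] attribute: checks database-driven
        // page/role tables instead of static role names.
        public class DbAuthorizeAttribute : AuthorizeAttribute
        {
            public override void OnAuthorization(AuthorizationContext filterContext)
            {
                // Run the normal authentication/role checks first.
                base.OnAuthorization(filterContext);
                if (filterContext.Result is HttpUnauthorizedResult)
                    return;

                string controller = filterContext.ActionDescriptor.ControllerDescriptor.ControllerName;
                string action = filterContext.ActionDescriptor.ActionName;
                string user = filterContext.HttpContext.User.Identity.Name;

                // Placeholder for a lookup against the same pages/roles tables
                // the WebForms base page used.
                if (!UserHasAccess(user, controller, action))
                    filterContext.Result = new HttpUnauthorizedResult();
            }

            private bool UserHasAccess(string user, string controller, string action)
            {
                // e.g. query Pages / PageRoles / UserRoles keyed on controller + action,
                // walking up to a parent "page" record when the child has no explicit entry.
                return true; // placeholder
            }
        }

    Decorating a controller or action with [DbAuthorize] then replaces the WebForms base-class check, and because the lookup runs per request, permission changes take effect immediately.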

    Read the article

  • How to create a Facebook-App style notifications custom table?

    - by Tim Büthe
    I want to add a custom table to my iPhone app that should look and work like the one used in the Facebook app for showing notifications. It should contain rows with links in them. The text should be black, while the tappable parts should appear blue. As I already figured out, labels have only one font, color and so on, and you can't mix styles within them, so I would have to use different components. The tappable text parts could be UIButtons with a custom style and no border.

    I tried different approaches to build the custom cell. I used Interface Builder to lay out the components, but since the different parts have variable lengths they overlaid each other. I also tried to add instances of UIButton and UILabel as subviews of the cell's contentView, as described in Apple's tutorial. The problem with this is that the position and size of the components are fixed and set when they get created using "initWithFrame:...". My code in "tableView:cellForRowAtIndexPath:" looks like this:

        UIButton *usernameButton;
        UILabel *mainLabel, *secondLabel;

        UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
        if (cell == nil) {
            // cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:CellIdentifier] autorelease];
            cell = [[[UITableViewCell alloc] initWithFrame:CGRectMake(0.0, 0.0, 220.0, 60.0)
                                           reuseIdentifier:CellIdentifier] autorelease];
            [cell setBackgroundColor:[UIColor yellowColor]];

            usernameButton = [[[UIButton alloc] initWithFrame:CGRectMake(0.0, 0.0, 220.0, 15.0)] autorelease];
            usernameButton.tag = USERNAME_BUTTON_TAG;
            // ... format, font etc.
            usernameButton.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleHeight;
            [cell.contentView addSubview:usernameButton];

            mainLabel = [[[UILabel alloc] initWithFrame:CGRectMake(0.0, 20.0, 220.0, 15.0)] autorelease];
            mainLabel.tag = MAINLABEL_TAG;
            // ... format, font etc.
            mainLabel.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleHeight;
            [cell.contentView addSubview:mainLabel];

            secondLabel = [[[UILabel alloc] initWithFrame:CGRectMake(0.0, 40.0, 220.0, 15.0)] autorelease];
            secondLabel.tag = SECONDLABEL_TAG;
            // ... format, font etc.
            secondLabel.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleHeight;
            [cell.contentView addSubview:secondLabel];
        } else {
            usernameButton = (UIButton *)[cell.contentView viewWithTag:USERNAME_BUTTON_TAG];
            mainLabel = (UILabel *)[cell.contentView viewWithTag:MAINLABEL_TAG];
            secondLabel = (UILabel *)[cell.contentView viewWithTag:SECONDLABEL_TAG];
        }

        usernameButton.titleLabel.text = @"Peter";
        mainLabel.text = @"sent you a message: ...";
        secondLabel.text = @"11 minutes ago";

        return cell;

    In a nutshell, I want to add the labels and buttons without a fixed position or size, the buttons should be clickable, and the cell height should be set automatically according to the content. Here is an example of the cell's content:

        Peter sent you a message: "Hello! What are you doing?"

    Peter should be click/tappable, and his message, which has a variable length, should be italic and wrapped if necessary.
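
    One piece of this, sketched only as an illustration with the pre-iOS-7 APIs that match the code above: letting the table compute each row's height from the message text. The 220-point width, the font, the 40-point fixed chrome, and the self.notifications data source are assumptions made to fit the layout in the question.

        // In the table view delegate: size each row to fit its (variable-length) message.
        - (CGFloat)tableView:(UITableView *)tableView heightForRowAtIndexPath:(NSIndexPath *)indexPath
        {
            // self.notifications is a hypothetical array of dictionaries backing the table.
            NSString *message = [[self.notifications objectAtIndex:indexPath.row] objectForKey:@"message"];
            CGSize textSize = [message sizeWithFont:[UIFont italicSystemFontOfSize:14.0]
                                  constrainedToSize:CGSizeMake(220.0, CGFLOAT_MAX)
                                      lineBreakMode:UILineBreakModeWordWrap];
            // 40 points for the username button row and the timestamp label.
            return textSize.height + 40.0;
        }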

    Read the article

  • Calling a datatrigger for a radio button inside a datagrid

    - by Farax
    I have a DataGrid with one column containing a radio button. I want to set the GroupName when a certain condition is reached. Below is the code:

        <Custom:DataGrid.Columns>
            <!-- ONLY ENABLED WHEN THE ITEM TYPE IS SINGLESELECT OR SINGLESELECT WITH ADDITIONAL DATA -->
            <Custom:DataGridTemplateColumn CanUserResize="False" MinWidth="20">
                <Custom:DataGridTemplateColumn.CellTemplate>
                    <DataTemplate>
                        <RadioButton IsChecked="{Binding IsChecked}"
                                     d:DesignWidth="16" d:DesignHeight="16"
                                     GroupName="SingleChoiceSelection"
                                     Template="{DynamicResource RadioButtonTemplate}"
                                     Background="{DynamicResource BackgroundNew}"
                                     BorderBrush="#FF7A7171"
                                     Foreground="#FF6C6C6C"
                                     Margin="0" />
                    </DataTemplate>
                </Custom:DataGridTemplateColumn.CellTemplate>
            </Custom:DataGridTemplateColumn>
            <Custom:DataGridTextColumn Header="Choices"
                                       Binding="{Binding ChoiceText}"
                                       CellStyle="{DynamicResource DataGridCellStyle2}"
                                       MinWidth="150" />
        </Custom:DataGrid.Columns>
        </Custom:DataGrid>

    The ItemsSource contains a property called IsChecked, and I want to change the foreground color when IsChecked changes to true. How do I do this with a DataTrigger?
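
    A minimal sketch of the usual DataTrigger approach (my own example; the highlight brush is arbitrary): give the RadioButton a Style whose trigger binds to IsChecked on the row's data item. The default Foreground has to be a Style setter rather than a local attribute, because a local value would take precedence over the trigger.

        <RadioButton IsChecked="{Binding IsChecked}"
                     GroupName="SingleChoiceSelection"
                     Template="{DynamicResource RadioButtonTemplate}"
                     Background="{DynamicResource BackgroundNew}"
                     BorderBrush="#FF7A7171"
                     Margin="0">
            <RadioButton.Style>
                <Style TargetType="RadioButton">
                    <!-- Default foreground, previously set as a local attribute -->
                    <Setter Property="Foreground" Value="#FF6C6C6C" />
                    <Style.Triggers>
                        <DataTrigger Binding="{Binding IsChecked}" Value="True">
                            <Setter Property="Foreground" Value="Red" />
                        </DataTrigger>
                    </Style.Triggers>
                </Style>
            </RadioButton.Style>
        </RadioButton>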

    Read the article

  • Entity Framework: further filter a navigation property without loading it into memory

    - by cellik
    Hi, I have two entities with a 1-to-N relation between them. Let's say Books and Pages. Book has a navigation property named Pages. Book has BookId as an identifier, and Page has an auto-generated id and a scalar property named PageNo. LazyLoading is set to true. I've generated this using VS2010 and .NET 4.0 and created a database from it. In the partial class of Book, I need a GetPage function like the one below:

        public Page GetPage(int PageNumber)
        {
            // checking whether it exists etc. is not included, for simplicity
            return Pages.Where(p => p.PageNo == PageNumber).First();
        }

    This works. However, since the Pages property of Book is an EntityCollection, it has to load all Pages of a book into memory in order to get the one page (which slows down the app the first time this function is hit for a given book). In other words, the framework does not merge the queries and run them at once; it loads the Pages into memory and then uses LINQ to Objects to do the second part. To overcome this I've changed the code as follows:

        public Page GetPage(int PageNumber)
        {
            MyContainer container = new MyContainer();
            return container.Pages.Where(p => p.PageNo == PageNumber && p.Book.BookId == BookId).First();
        }

    This works considerably faster; however, it doesn't take into account the pages that have not yet been saved to the database. So both options have their cons. Is there any trick in the framework to overcome this situation? This must be a common scenario where you don't want all of the objects of a navigation property loaded in memory when you don't need them.
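
    For what it's worth, a sketch of a middle ground on EF 4's ObjectContext-based code generation (an assumption about what the generated classes look like): EntityCollection<T> exposes CreateSourceQuery(), which builds a database query scoped to this book's Pages relationship without materializing the whole collection.

        public Page GetPage(int pageNumber)
        {
            // CreateSourceQuery builds an ObjectQuery<Page> scoped to this book's
            // Pages relationship, so the filter runs in the database and the full
            // collection is never pulled into memory.
            return Pages.CreateSourceQuery()
                        .Where(p => p.PageNo == pageNumber)
                        .First();
        }

    Like the second version in the question, this only sees rows already in the database; pages added to the context but not yet saved would still need a separate in-memory check.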

    Read the article
