Search Results

Search found 26808 results on 1073 pages for 'storing information'.


  • Examining ASP.NET's Membership, Roles, and Profile - Part 18

    Membership, in a nutshell, is a framework built into the .NET Framework that supports creating, authenticating, deleting, and modifying user account information. Each user account has a set of core properties: username, password, email, a security question and answer, whether or not the account has been approved, whether or not the user is locked out of the system, and so on. These user-specific properties are certainly helpful, but they're hardly exhaustive - it's not uncommon for an application to need to track additional user-specific properties. For example, an online messageboard site might also want to associate a signature, homepage URL, and IM address with each user account. There are two ways to associate additional information with user accounts when using the Membership model. The first - which affords the greatest flexibility, but requires the most upfront effort - is to create a custom data store for this information. If you are using the SqlMembershipProvider, this would mean creating an additional database table that has the UserId value from the aspnet_Users table as its primary key, plus columns for each of the additional user properties. The second option is to use the Profile system, which allows additional user-specific properties to be defined in a configuration file. (See Part 6 for an in-depth look at the Profile system.) This article explores how to store additional user information in a separate database table. We'll see how to allow a signed-in user to update these additional user-specific properties and how to create a page to display information about a selected user. What's more, we'll look at using ASP.NET Routing to display user information using an SEO-friendly, human-readable URL like www.yoursite.com/Users/username. Read on to learn more!
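
    As a rough illustration of the first approach, here is a minimal Python sketch of creating such an extra-properties table against SQL Server via pyodbc. The table and column names (UserProfiles, Signature, HomepageUrl, ImAddress) and the connection string are placeholders, not part of the article.

        import pyodbc

        # One row of extra properties per Membership user, keyed on the UserId
        # from aspnet_Users. Table/column names and the connection string are
        # placeholders -- adjust them for your own schema and server.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=MyAppDb;Trusted_Connection=yes;"
        )
        cur = conn.cursor()
        cur.execute("""
            CREATE TABLE UserProfiles (
                UserId      UNIQUEIDENTIFIER NOT NULL PRIMARY KEY
                            REFERENCES aspnet_Users(UserId),
                Signature   NVARCHAR(500) NULL,
                HomepageUrl NVARCHAR(256) NULL,
                ImAddress   NVARCHAR(128) NULL
            )
        """)
        conn.commit()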

    Read the article

  • Oracle’s AutoVue Enables Visual Decision Making

    - by Pam Petropoulos
    That old saying about a picture being worth a thousand words has never been truer.  Check out the latest reports from IDC Manufacturing Insights which highlight the importance of incorporating visual information in all facets of decision making and the role that Oracle’s AutoVue Enterprise Visualization solutions can play. Take a look at the excerpts below and be sure to click on the titles to read the full reports. Technology Spotlight: Optimizing the Product Life Cycle Through Visual Decision Making, August 2012 Manufacturers find it increasingly challenging to make effective product-related decisions as the result of expanded technical complexities, elongated supply chains, and a shortage of experienced workers. These factors challenge the traditional methodologies companies use to make critical decisions. However, companies can improve decision making by the use of visual decision making, which synthesizes information from multiple sources into highly usable visual context and integrates it with existing enterprise applications such as PLM and ERP systems. Product-related information presented in a visual form and shared across communities of practice with diverse roles, backgrounds, and job skills helps level the playing field for collaboration across business functions, technologies, and enterprises. Visual decision making can contribute to manufacturers making more effective product-related decisions throughout the complete product life cycle. This Technology Spotlight examines these trends and the role that Oracle's AutoVue and its Augmented Business Visualization (ABV) solution play in this strategic market. Analyst Connection: Using Visual Decision Making to Optimize Manufacturing Design and Development, September 2012 In today's environments, global manufacturers are managing a broad range of information. Data is often scattered across countless files throughout the product life cycle, generated by different applications and platforms. Organizations are struggling to utilize these multidisciplinary sources in an optimal way. Visual decision making is a strategy and technology that can address this challenge by integrating and widening access to digital information assets. Integrating with PLM and ERP tools across engineering, manufacturing, sales, and marketing, visual decision making makes digital content more accessible to employees and partners in the supply chain. The use of visual decision-making information rendered in the appropriate business context and shared across functional teams contributes to more effective product-related decision making and positively impacts business performance.

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 3 (sys.dm_exec_connections)

    - by Tamarick Hill
    The third DMV we will review is the sys.dm_exec_connections DMV. This DMV is server-scoped and displays information about each and every current connection on your SQL Server instance. Let's take a look at some information that this DMV returns. SELECT * FROM sys.dm_exec_connections After reviewing this DMV, in my opinion, there is not a whole lot of useful information returned from it from a monitoring or troubleshooting standpoint. The primary use case I have for this DMV is when I need to get a quick count of how many connections I have on one of my SQL Server boxes. For this purpose a quick SELECT COUNT(*) satisfies my need. However, for those who need it, there is other information such as what type of authentication a specific connection is using, network packet size, and client/local TCP ports being used. This information can come in handy for specific scenarios, but you probably won't need it very much for your day-to-day monitoring/troubleshooting needs. However, this is still an important DMV that you should be aware of in the event that you need it. For more information on this DMV, please see the Books Online link below: http://msdn.microsoft.com/en-us/library/ms181509.aspx
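
    As a hedged illustration of the quick count described above, here is a minimal Python sketch using pyodbc against a local SQL Server instance; the connection string is a placeholder, and the extra columns shown are the ones the post mentions (authentication scheme, packet size, TCP ports).

        import pyodbc

        # Quick connection count from sys.dm_exec_connections, plus a few of the
        # per-connection columns mentioned above. Point the connection string at
        # your own instance.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=master;Trusted_Connection=yes;"
        )
        cur = conn.cursor()

        cur.execute("SELECT COUNT(*) FROM sys.dm_exec_connections")
        print("Current connections:", cur.fetchone()[0])

        cur.execute("""
            SELECT session_id, auth_scheme, net_packet_size,
                   client_tcp_port, local_tcp_port
            FROM sys.dm_exec_connections
        """)
        for row in cur.fetchall():
            print(row)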

    Read the article

  • Which Ubuntu version to use on a MAXDATA Eco 3100X laptop, given this system info?

    - by Erjet Malaj
    i am speaking as new ubuntu user, i just have installed ubuntu 10.04 on my laptop, but is running very slow... So i am here to ask you a question: WHich ubuntu version can fit for my laptop MAXDATA Eco 3100x, . My Laptop System Information are: SYSTEM INFORMATION Running Ubuntu Linux, the Ubuntu 10.04 (lucid) release. GNOME: 2.30.2 (Ubuntu 2010-06-25) Kernel version: 2.6.32-40-generic (#87-Ubuntu SMP Mon Mar 5 20:26:31 UTC 2012) GCC: 4.4.3 (i486-linux-gnu) Xorg: unknown (25 February 2012 06:59:39AM) (25 February 2012 06:59:39AM) Hostname: lotus-laptop Uptime: 0 days 1 h 6 min CPU INFORMATION GenuineIntel, Intel(R) Pentium(R) 4 CPU 2.40GHz Number of CPUs: 1 CPU clock currently at 2390.561 MHz with 512 KB cache Numbering: family(15) model(2) stepping(7) Bogomips: 4781.12 Flags: fpu vme de pse tsc msr pae mce cx8 mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe up pebs bts cid MEMORY INFORMATION Total memory: 228 MB Total swap: 455 MB STORAGE INFORMATION SCSI device - scsi0 Vendor: ATA Model: IBM-DJSA-210 SCSI device - scsi1 Vendor: TOSHIBA Model: DVD-ROM SD-C2502 HARDWARE INFORMATION MOTHERBOARD Host bridge Silicon Integrated Systems [SiS] 650/M650 Host (rev 11) PCI bridge(s) Silicon Integrated Systems [SiS] Virtual PCI-to-PCI bridge (AGP) Silicon Integrated Systems [SiS] Virtual PCI-to-PCI bridge (AGP) USB controller(s) Silicon Integrated Systems [SiS] USB 1.1 Controller (rev 0f) (prog-if 10) Silicon Integrated Systems [SiS] USB 1.1 Controller (rev 0f) (prog-if 10) Silicon Integrated Systems [SiS] USB 1.1 Controller (rev 0f) (prog-if 10) Silicon Integrated Systems [SiS] USB 2.0 Controller (prog-if 20) Silicon Integrated Systems [SiS] USB 1.1 Controller (rev 0f) (prog-if 10) Silicon Integrated Systems [SiS] USB 1.1 Controller (rev 0f) (prog-if 10) Silicon Integrated Systems [SiS] USB 1.1 Controller (rev 0f) (prog-if 10) Silicon Integrated Systems [SiS] USB 2.0 Controller (prog-if 20) ISA bridge Silicon Integrated Systems [SiS] SiS962 [MuTIOL Media IO] (rev 04) IDE interface Silicon Integrated Systems [SiS] 5513 [IDE] (prog-if 80 [Master]) Subsystem: Silicon Integrated Systems [SiS] 5513 [IDE] GRAPHIC CARD VGA controller Silicon Integrated Systems [SiS] 65x/M650/740 PCI/AGP VGA Display Adapter Subsystem: Uniwill Computer Corp Device 5103 SOUND CARD Multimedia controller Silicon Integrated Systems [SiS] AC'97 Sound Controller (rev a0) Subsystem: Uniwill Computer Corp Device 5203 NETWORK Ethernet controller Silicon Integrated Systems [SiS] SiS900 PCI Fast Ethernet (rev 91) Subsystem: Uniwill Computer Corp Device 5002 Modem Silicon Integrated Systems [SiS] AC'97 Modem Controller (rev a0) Subsystem: Uniwill Computer Corp Device 4003 Thanks you asap. :-) E

    Read the article

  • Jagran Prakashan Increases Staff Productivity by 40%

    - by Michael Snow
    Jagran Prakashan Increases Staff Productivity by 40%, Launches New IT Projects up to 4x Faster, Enables Mobile Service, and Improves Business Agility Oracle Customer: JPL Location:  Uttar Pradesh, India Industry: Media and Entertainment Employees:  10,000 Annual Revenue:  $100 to $500 Million Jagran Prakashan Ltd. (JPL) is one of India's premier media and communications groups with interests spanning print, advertising, event management, and mobile services for weather, cricket scores, and educational activities. It is a major media enterprise, with 300 locations across 15 states. Its impressive stable of print publications includes Dainik Jagran, the world’s most widely read daily newspaper––with a readership of over 55 million––the country’s leading afternoon dailies, and a range of popular local, bilingual, and English language newspapers. JPL was using multiple systems to manage its business processes. Users were resistant to using multiple passwords for various applications, preferring to continue their less efficient, legacy work practices. In addition, there was no single repository for sharing documents across the organization, such as company announcements or project documents. The company relied on e-mail to disseminate up-to-date company information, often missing employees. It was also time-consuming and difficult for managers to track the status of ongoing assignments or projects because collaboration and document sharing was inefficient and ineffective.With diverse businesses and many geographic locations, JPL needed to implement a centralized and user-friendly enterprise portal to improve document sharing and collaboration and increase business agility. The company implemented Oracle WebCenter Portal to create a dynamic, secure, and intuitive self-service enterprise portal to improve the user experience and increase operating efficiency. It improved staff productivity by 40%, accelerated new IT projects by up to 4x, boosted staff morale, and increased business agility.   Increases Staff Productivity by 40%, Launches New Products up to 2x Faster A word from JPL "With Oracle WebCenter Portal, we gained a dynamic, secure, and intuitive self-service enterprise portal that provided an exceptional user experience and enabled us to engage employees in a collaborative environment. It increased IT staff productivity by 40%, delivered new projects up to 4x faster, and enabled mobile service to improve our business agility.” Sarbani Bhatia, Vice President IT, Jagran Prakashahn Ltd Before implementing Oracle WebCenter Portal, JPL stored project-critical information, such as page planning of daily newspaper editions and the launch of new editions or supplements on individual laptops or in the e-mail system. Collaboration between colleagues was limited to physical meetings, telephone discussions, and e-mail. It was difficult to trace and recover important project documents when a staff member resigned, which represented a significant risk to business continuity. Employees were also averse to multiple passwords and resisted using the systems, affecting staff productivity. With Oracle WebCenter Portal, JPL created a dynamic, secure, and intuitive self-service enterprise portal with business activity streams. The portal allowed users to navigate, discover, and access information, such as advertising rates, requisition approvals, ad-hoc queries, and employee surveys from a single entry point with a single password. 
Managers can also upload important documents, such as new pricing for advertisers or newspaper distributors, and share them through the information and instruction section in the portal. In addition, managers can now easily track and review timelines for projects online rather than gathering information from meetings and e-mails. The company gained the ability to centrally manage information, ensured business continuity, and improved staff productivity by 40%.“In the media industry, news has a very short shelf life, so speed is crucial. Information delayed is like information lost,” said Sarbani Bhatia, vice president IT, Jagran Prakashahn Ltd. “Thanks to Oracle WebCenter Portal’s contextual collaboration tools, we can provide and share feedback for new project launches, such as career or education supplements, up to 2x faster through discussion forums or knowledge groups. Tasks that previously required four months, we now complete in one month.”In addition, the company can broadcast announcements, flash employee birthdays, and promote important events through the message section on the webpage, instead of using the e-mail system. The company can also conduct opinion polls to gauge employee response to organizational issues and improve management decision-making.“With over 10,000 employees across 300 locations, it is critical for management to hear the voice of employees and develop a cohesive organizational culture. Oracle WebCenter Portal enables employees to engage with business processes and systems in a collaborative environment, providing users with an exceptional experience,” Bhatia said. Enables Mobility Access and Increases Business Agility Newspaper advertisements generate the majority of JPL’s revenue. With most sales staff on the move, the company needed to ensure timely approval of print advertisement discounts for specific clients and meet tight publication deadlines.  By integrating Oracle WebCenter Portal seamlessly with its enterprise resource planning (ERP) system and other applications, such as the organizational mass mailing system, business intelligence, and management information system, JPL embedded its approval workflow processes into the enterprise portal and provided users with an integrated and intuitive interface. About 30% of JPL’s sales staff members now have tablets and receive advertising discount approval from managers while in the field and no longer need to return to the office, which has significantly improved efficiency and increased business agility.“Application mobility was critical for sales representatives in the field to meet stringent auditing requirements for online accountability, particularly for our newspaper advertising business. Staff member satisfaction has improved significantly now that the sales team can use tablets to access the portal––a capability we will extend to smart phones in the second stage of the implementation,” Bhatia said. Accelerates Application Development by up to 4x and Cuts Costs by up to 60% With Oracle WebCenter Portal, users can easily create, modify, and upload information to their personalized webpages without IT assistance. By seamlessly integrating Oracle WebCenter Portal with the payroll database, managers can decide which members of their team can access the page and with whom they will share information, a decision based on role or geographical location. 
A sales representative selling advertising space for a local language daily newspaper, for example, can upload an updated advertising rate relevant only to that particular publication. Users can also easily adapt to the new platform, thanks to its intuitive design and look, reducing the need for training and lowering resistance to using the system.Using Oracle WebCenter Portal’s out-of-the-box reusable components, such as portal pages and templates, provided JPL’s developers with a comprehensive and flexible user experience platform and increased the speed of application development. In less than five months, JPL developed more than 55 workflows. The IT team accelerated deployment of new applications by up to 4x, as they do not need to install them on individual machines now that they have a web-based environment.   “Previously, we would have spent a whole day deploying a new application for each department or location. With a browser-based environment, we have cut costs by up to 60% by reducing deployment time to zero, because our IT team can roll out a new application from a single point, thanks to Oracle WebCenter Portal,” Bhatia said. Challenges Provide a dynamic, secure, and intuitive self-service enterprise portal to improve staff productivity and ensure business continuity Enable seamless integration with multiple enterprise applications to improve workflow efficiency—including approval of print advertisement discounts—and increase business agility Improve engagement with employees and enable collaboration to enhance management decision-making Accelerate time-to-market for new services, such as new advertising programs Solutions Oracle Product and ServicesOracle WebCenter Portal 11g Increased staff productivity by 40% and enhanced user satisfaction by enabling employees to easily navigate, discover, and access information from a single, self-service enterprise portal without IT assistance Launched new products, such as career or education supplements, up to 2x faster by enabling peer collaboration and incorporating feedback generated through discussion forums, thanks to Oracle WebCenter Portal’s out-of-the-box collaboration tools Accelerated application development up to 4x by enabling developers to optimize reusable components for managing and deploying new applications in a browser-based environment rather than spending one day to install applications for each department, cutting costs by up to 60% Ensured business continuity by enabling managers to easily track and review project timelines online rather than storing important documents on individual laptops or relying on the e-mail system Increased business agility and operational efficiency by seamlessly integrating with the in-house, ERP system and embedding business processes into a single portal Boosted company revenue by enabling sales team members to submit print-advertising discount requests through mobile devices instead of waiting to return to office, ensuring timely approval from managers to meet tight publication deadlines Improved management decision-making by enabling employees to easily share and access feedback through opinion polls or forums, boosting staff morale Introduced the single sign-on capability and enhanced security by enabling managers to decide access level for staff members based on role or geographical location Reduced the need for staff training and minimized user resistance to systems by providing a dynamic and intuitive user experience Why Oracle JPL did not consider other products because the 
company was already using Oracle Database, Enterprise Edition with Real Application Clusters and had a positive experience with Oracle. JPL chose Oracle WebCenter Portal to ensure no compatibility issues for integration with its existing Oracle products and to take advantage of the experience and support of a reputable vendor to ensure business continuity. “We chose Oracle because we knew we could rely on its support and experience. In addition, Oracle WebCenter Portal’s speed, agility, and mobile access features were a perfect fit for our business requirements,” Bhatia said. Implementation Process JPL launched the enterprise portal to 500 users in the first phase of the project, and plans to extend this to 2,000 users when the portal is fully launched. Oracle partner PricewaterhouseCoopers used Oracle Application Development Framework for the initial setup and user training, and to develop and design sample workflows. JPL’s internal IT staff then took charge of the implementation, bringing it to completion on budget. Partner: Oracle Partner PricewaterhouseCoopers (India)

    Read the article

  • Free workshop: Discover the structured and unstructured data exploration solution

    - by David lefranc
    Explore and discover information… We are offering a discovery workshop that lets you explore any type of data with the Oracle Endeca Information Discovery solution. When: 7 December 2012, 9:30 AM to 12:30 PM. Where: Oracle, 15 Boulevard Charles de Gaulle, 92715 Colombes. To register: [email protected] Designed for business users, this half-day workshop will introduce you to Oracle Endeca Information Discovery so that you can: Understand and explore information coming from many different sources (Big Data, social networks, forums, surveys, blogs...) Discover why and how OEID complements traditional BI solutions Through simple, fast navigation, you will see how easy it is to find answers to unanticipated questions using OEID without prior training. Use search and guided navigation to see how structured and unstructured information can be quickly brought together to reveal hidden value. Explore all of your data in any format and from any source, including social media, documents, and files. Discover and explore your data without a predefined repository, so that users can work autonomously and analyze their own data quickly. Build a strategy to increase the value of enterprise data while reducing total cost of ownership. Experience the impressive performance of Endeca on Oracle Exalytics, the in-memory machine. Agenda: After an introduction to the Oracle Endeca Information Discovery solution, followed by a hands-on workshop, you will see how easy it is to: Use guided navigation and the search engine to explore structured and unstructured data Quickly integrate new data sources such as social media Build new user interfaces while discovering information Respond quickly to changing business needs and data environments When and where: 7 December 2012, 9:30 AM to 12:30 PM, Oracle, 15 Boulevard Charles de Gaulle, 92715 Colombes

    Read the article

  • Is it possible to transfer a domain without a "gap" in Whois privacy protection?

    - by Guest
    I currently own several domains on which I am using a Whois privacy protection service to hide my personal details. In the near future, I would like to transfer some of these domains to a different registrar. It has been many years since I last performed domain transfers, so I am no longer knowledgeable about what it involves. However, I have read from several registrars that they ask their customers to disable Whois protection before effecting a domain transfer. Since there are several websites out there that publish archived versions of Whois information (and ask handsome money for the information to be hidden, of course), I would prefer to avoid having such a "gap" in my privacy protection. I figured that these websites would fetch Whois information mainly when a query is effected through their own website. However, I have found out that at least one of these sites had a copy of the Whois information for a new domain up on their site within hours after I registered it, so they must have some other source (of course I used a Google search to find that out, not their own site). What that tells me is that the time it takes for the domain transfers to go through would be more than enough for these rogue websites to cache my information. If my new registrar offers privacy protection for domains right from the point of registration as well, is there no way to transfer the domain between the two without reverting to my default Whois information in between?

    Read the article

  • Sendmail encrypted

    - by user1948828
    I manage a website running on Apache. It has public and private areas. When people apply for an account to access the protected portions of the site, they do a TLS/SSL-protected POST containing their information, which is saved to a (hopefully) nonpublic directory on the server. Then I have a Python script which takes URL-encoded POSTs with this user information, sends back a plaintext confirmation to the applicant, encrypts their information with a freeware Java command-line utility to protect it (specifically this one: http://spi.dod.mil/ewizard.htm), base64 encodes it, puts it in a file as a MIME attachment, and uses sendmail to forward the file information to my (and several coworkers', scattered around the country) email accounts on an Exchange server with Outlook clients. This has worked well for years, but is awkward because it involves manually decrypting the information on a Windows box once it is received, using the above-mentioned encryption utility. This significantly limits how many can be processed. I would like to be able to encrypt my information in a format that Outlook/Exchange can inherently understand and display, so that these emails can be viewed simply by clicking on them. I do have company-provided PKI public certs for all the people I need to send to, and am able to send/receive encrypted emails in Outlook manually, but would like to know how I can send to Outlook from Apache/Linux/Python from the command line using the same PKI certs. I don't need to receive them, just send. Is there a utility that can do this? I had thought PGP might work, but I haven't been able to figure it out.
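
    One way to do this from a script is to let the openssl smime command build the encrypted message and hand it to sendmail; Outlook opens S/MIME natively. Below is a minimal Python sketch of that idea, assuming openssl and sendmail are on the PATH; the certificate path, addresses, and subject are placeholders.

        import subprocess

        # Build an S/MIME-encrypted message with the openssl CLI and pipe it to
        # sendmail. Paths and addresses below are placeholders.
        recipient_cert = "/etc/pki/recipients/jane.doe.pem"   # recipient's public cert (PEM)
        sender = "webserver@example.org"
        recipient = "jane.doe@example.org"
        body = "Applicant information goes here.\n"

        # "openssl smime -encrypt" writes a complete MIME message (headers included)
        # to stdout when given -from/-to/-subject.
        smime = subprocess.run(
            ["openssl", "smime", "-encrypt", "-aes256",
             "-from", sender, "-to", recipient,
             "-subject", "New account application",
             recipient_cert],
            input=body.encode(), stdout=subprocess.PIPE, check=True,
        )

        # "sendmail -t" reads the recipients from the message headers.
        subprocess.run(["sendmail", "-t"], input=smime.stdout, check=True)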

    Read the article

  • Simple (I hope) Excel question about

    - by Princess
    I am doing a directory for my neighborhood. We had most of the information from a previous directory. The information was entered: A1 name, B1 address, and C1 phone number; A2 name, B2 address, C2 phone number; and so on. The publisher wants the information in a different format: A1 name, A2 address, A3 phone number, A4 blank; A5 name, A6 address, A7 phone number, A8 blank; and so on. Is there an easy (or heck - a not so easy) way to have Excel change the format of the information without me having to hand-type 1,300 households' information? I will also need to reformat the information a second time into a crisscross. The format for that one is: A1 Street name, A2 Address Number, B2 Resident Name and C2 Phone number.
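
    A minimal sketch of the first reshape in Python with openpyxl follows; the file and sheet names are placeholders. It reads each row of name/address/phone from the old layout and writes them stacked down column A with a blank row between households.

        from openpyxl import Workbook, load_workbook

        # Read rows of (name, address, phone) and write them stacked down column A:
        # name, address, phone, blank. File and sheet names are placeholders.
        src = load_workbook("old_directory.xlsx")["Sheet1"]
        out_wb = Workbook()
        out = out_wb.active

        for name, address, phone in src.iter_rows(min_col=1, max_col=3, values_only=True):
            if name is None:              # skip any empty rows at the bottom
                continue
            out.append([name])
            out.append([address])
            out.append([phone])
            out.append([None])            # blank separator row

        out_wb.save("directory_for_publisher.xlsx")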

    Read the article

  • What's the latest version of OpenStack that I can deploy with Juju? Where can I find this information?

    - by Azendale
    I've been trying to deploy OpenStack. I thought Havana was the latest, but when I deployed nova-cloud-controller, the log said 2013-10-24 19:13:25 INFO juju juju-log.go:66 nova-cloud-controller/0: FATAL ERROR: Invalid Cloud Archive release specified: precise-updates/havana. Is there somewhere to see what the latest Juju-deployable (with the standard repositories) version is? Or am I just using the wrong format when I specify openstack-origin: cloud:precise-updates/havana in the config?

    Read the article

  • I cannot install wine 1.5 on ubuntu 12.04 x64

    - by user26286
    I tried: $ sudo add-apt-repository ppa:ubuntu-wine/ppa $ sudo apt-get update $ sudo apt-get install wine1.5 result was: $ sudo apt-get install wine1.5 Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: wine1.5 : Depends: wine1.5-i386 (= 1.5.6-0ubuntu1~pulse17) E: Unable to correct problems, you have held broken packages. than I tried: $ sudo dpkg --configure -a $ sudo apt-get install -f result was: $ sudo apt-get install -f Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. after all I tried: $ sudo apt-get build-dep wine1.5 Reading package lists... Done Building dependency tree Reading state information... Done The following packages have unmet dependencies: libgnutls-dev : Depends: libgnutls26 (= 2.12.14-5ubuntu3) but 2.12.19-1 is to be installed Depends: libgnutlsxx27 (= 2.12.14-5ubuntu3) but it is not going to be installed Depends: libgnutls-openssl27 (= 2.12.14-5ubuntu3) but it is not going to be installed E: Build-dependencies for wine1.5 could not be satisfied. also tried: $ sudo apt-get install -y ia32-libs Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: ia32-libs : Depends: ia32-libs-multiarch E: Unable to correct problems, you have held broken packages. Somebody can help me? Thanks.

    Read the article

  • comparing two tables [closed]

    - by sza
    I have two identical tables, i.e. all the columns are identical: one column's datatype is text, one is varchar(255), and the rest are int. Let's say the table name is 'AAAAA'. Table AAAAA was processed and backed up earlier this month. Both tables were storing data, and now only the second table is storing data. I need to find the unmatched records from the second table (BBBBB), which is storing data right now, and add those records to table AAAAA. Your help will be highly appreciated. I tried to use 'EXCEPT' but it does not support the text datatype.
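
    One workaround, sketched below in Python with pyodbc, is to avoid EXCEPT on the text column altogether and insert the rows from BBBBB whose key is missing from AAAAA (or, if you do want EXCEPT, compare CAST(textcol AS VARCHAR(MAX)) instead of the raw text column). The key column name and connection string here are placeholders.

        import pyodbc

        # Copy rows that exist in BBBBB but not in AAAAA, matching on a key column.
        # "id" is a placeholder; EXCEPT with CAST(textcol AS VARCHAR(MAX)) is an
        # alternative if you need a full column-by-column comparison.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=MyDb;Trusted_Connection=yes;"
        )
        cur = conn.cursor()
        cur.execute("""
            INSERT INTO AAAAA
            SELECT b.*
            FROM BBBBB AS b
            WHERE NOT EXISTS (SELECT 1 FROM AAAAA AS a WHERE a.id = b.id)
        """)
        conn.commit()
        print("Rows copied:", cur.rowcount)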

    Read the article

  • How should I implement a transaction database in EJB 3.0?

    - by JamesBoyZ
    In the CustomerTransactions entity, I have the following field to record what the customer bought: @ManyToMany private List<Item> listOfItemsBought; When I think more about this field, there's a chance it may not work because merchants are allowed to change item's information (e.g. price, discount, etc...). Hence, this field will not be able to record what the customer actually bought when the transaction occurred. At the moment, I can only think of 2 ways to make it work. I will record the transaction details into a String field. I feel that this way would be messy if I need to extract some information about the transaction later on. Whenever the merchant changes an item's information, I will not update directly to that item's fields. Instead, I will create another new item with all the new information and keep the old item untouched. I feel that this way is better because I can easily extract information about the transaction later on. However, the bad side is that my Item table may contain a lot of rows. I'd be very grateful if someone could give me an advice on how I should tackle this problem. UPDATE: I'd like to add more information about the current design. public class Customer implements Serializable { @OneToMany private List<CustomerTransactions> listOfTransactions; } public class CustomerTransactions implements Serializable { @ManyToMany private List<Item> listOfItemsBought; } public class Merchant implements Serializable { @OneToMany private List<Item> listOfSellingItems; }
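
    To make the second option concrete, here is a small, language-neutral sketch in Python of the "never edit a sold item, publish a new version instead" idea; the class and field names are illustrative only, not the actual JPA entities above.

        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import Dict, List, Optional

        # Option 2 from the question: item records are never edited in place. A
        # price/description change creates a new Item row, and each transaction
        # keeps pointing at the exact row that was current at purchase time.
        @dataclass(frozen=True)
        class Item:
            item_id: int
            name: str
            price: float
            supersedes: Optional[int] = None   # id of the older version, if any

        @dataclass
        class CustomerTransaction:
            when: datetime
            items_bought: List[Item] = field(default_factory=list)

        catalog: Dict[int, Item] = {}
        _next_id = 1

        def publish_item(name: str, price: float, old: Optional[Item] = None) -> Item:
            """Add a new immutable item version instead of editing an existing one."""
            global _next_id
            item = Item(_next_id, name, price,
                        supersedes=old.item_id if old else None)
            catalog[item.item_id] = item
            _next_id += 1
            return item

        v1 = publish_item("Coffee beans 1kg", 18.50)
        tx = CustomerTransaction(datetime.now(), items_bought=[v1])
        v2 = publish_item("Coffee beans 1kg", 21.00, old=v1)   # merchant price change

        print(tx.items_bought[0].price)   # still 18.5, what the customer actually paid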

    Read the article

  • Insert Registration Data in MySQL using PHP

    - by J M 4
    I may not be asking this in the best way possible but i will try my hardest. Thank you ahead of time for your help: I am creating an enrollment website which allows an individual OR manager to enroll for medical testing services for professional athletes. I will NOT be using the site as a query DB which anybody can view information stored within the database. The information is instead simply stored, and passed along in a CSV format to our network provider so they can use as needed after the fact. There are two possible scenarios: Scenario 1 - Individual Enrollment If an individual athlete chooses to enroll him/herself, they enter their personal information, submit their payment information (credit/bank account) for processing, and their information is stored in an online database as Athlete1. Scenario 2 - Manager Enrollment If a manager chooses to enroll several athletes he manages/ promotes for, he enters his personal information, then enters the personal information for each athlete he wishes to pay for (name, address, ssn, dob, etc), then submits payment information for ALL athletes he is enrolling. This number can range from 1 single athlete, up to 20 athletes per single enrollment (he can return and complete a follow up enrollment for additional athletes). Initially, I was building the database to house ALL information regardless of enrollment type in a single table which housed over 400 columns (think 20 athletes with over 10 fields per athlete such as name, dob, ssn, etc). Now that I think about it more, I believe create multiple tables (manager(s), athlete(s)) may be a better idea here but still not quite sure how to go about it for the following very important reasons: Issue 1 If I list the manager as the parent table, I am afraid the individual enrolling athlete will not show up in the primary table and will not be included in the overall registration file which needs to be sent on to the network providers. Issue 2 All athletes being enrolled by a manager are being stored in SESSION as F1FirstName, F2FirstName where F1 and F2 relate to the id of the fighter. I am not sure technically speaking how to store multiple pieces of information within the same table under separate rows using PHP. For example, all athleteswill have a first name. The very basic theory of what i am trying to do is: If number_of_athletes 1, store F1FirstName in row 1, column 1 of Table "Athletes"; store F1LastName in row 1, column 2 of Table "Athletes"; store F2FirstName in row 2, column 1 of Table "Athletes"; store F2LastName in row 2, column 2 of table "Athletes"; Does this make sense? I know this question is very long and probably difficult so i appreciate the guidance.
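
    A minimal sketch of the row-per-athlete idea follows, written with Python's standard-library sqlite3 purely to illustrate the table shape and the insert loop (the real site would do the same thing in PHP with parameterized MySQL inserts); all table and column names here are hypothetical.

        import sqlite3

        # One row per athlete, linked to the enrollment that paid for it, instead
        # of 400+ columns in a single table. Names below are hypothetical.
        conn = sqlite3.connect("enrollment.db")
        conn.executescript("""
            CREATE TABLE IF NOT EXISTS enrollments (
                enrollment_id INTEGER PRIMARY KEY AUTOINCREMENT,
                manager_name  TEXT            -- NULL for an individual enrollment
            );
            CREATE TABLE IF NOT EXISTS athletes (
                athlete_id    INTEGER PRIMARY KEY AUTOINCREMENT,
                enrollment_id INTEGER NOT NULL REFERENCES enrollments(enrollment_id),
                first_name    TEXT NOT NULL,
                last_name     TEXT NOT NULL,
                dob           TEXT,
                ssn           TEXT
            );
        """)

        # Data that would come from the session (F1FirstName, F2FirstName, ...).
        athletes = [
            ("John", "Smith", "1990-04-12", "000-00-0001"),
            ("Jane", "Jones", "1988-11-30", "000-00-0002"),
        ]

        cur = conn.execute("INSERT INTO enrollments (manager_name) VALUES (?)",
                           ("Some Manager",))
        enrollment_id = cur.lastrowid

        conn.executemany(
            "INSERT INTO athletes (enrollment_id, first_name, last_name, dob, ssn) "
            "VALUES (?, ?, ?, ?, ?)",
            [(enrollment_id,) + a for a in athletes],
        )
        conn.commit()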

    Read the article

  • Data from two tables without repeating data from the first?

    - by Aran
    I have two tables: a Users table and a Users Meta table. I am looking for a way to get all the information out of both tables with one query, but without repeating the information from the Users table. This is all information relating to the user's ID number as well, so for example user_id = 1. Is there a way to query the database and collect all the information from both tables without repeating the information from the first?
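
    One common answer is a single JOIN that aggregates the meta rows per user, so the user's own columns appear only once; below is a minimal sketch using Python's sqlite3 with hypothetical table and column names (GROUP_CONCAT exists in both SQLite and MySQL).

        import sqlite3

        # The users row appears once; all of that user's meta rows are rolled up
        # into one column. Table and column names are hypothetical.
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE users (user_id INTEGER PRIMARY KEY, name TEXT);
            CREATE TABLE users_meta (
                user_id INTEGER REFERENCES users(user_id),
                meta_key TEXT, meta_value TEXT
            );
            INSERT INTO users VALUES (1, 'Aran');
            INSERT INTO users_meta VALUES (1, 'twitter', '@aran'), (1, 'city', 'Leeds');
        """)

        row = conn.execute("""
            SELECT u.user_id, u.name,
                   GROUP_CONCAT(m.meta_key || '=' || m.meta_value, '; ') AS meta
            FROM users AS u
            LEFT JOIN users_meta AS m ON m.user_id = u.user_id
            WHERE u.user_id = ?
            GROUP BY u.user_id, u.name
        """, (1,)).fetchone()

        print(row)   # e.g. (1, 'Aran', 'twitter=@aran; city=Leeds')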

    Read the article

  • Single entity with single view or two views in mvc3 vs2010?

    - by user2905798
    I have the following entity model public class Employee { public int Employee ID{get;set;} public string employeename{get;set;} public datetime employeeDOb{get;set;} public datetime? employeeDateOfJoin{get;set;} public string empFamilyname{get;set;} public datetime empFamilyDob{get;set;} } here I have to design a view for collecting employee information and employee family information. Since I am working on already available data, where in empFamilyDob was not mandatory. But now it is being made mandatory, the previous data doesn't contain EmpFamilyDob. So naturally I have added this new property EmpFamilyDob to the Model and made it required through DataAnnotations. Now there are two set of views to be developed. 1. A view which simply allows to collect the employee information without employee family information. i.e, empFamilyName and EmpFamilyDob.--This view is used by the Hr section to insert empplyee details Since the empFamilyname and EmpFamilyDob being now made mandatory, some other section will edit the data and update the EmpFamilyName and EmpFamilyDob as and when the information about employee family details are received. I have action controller for CreateNew and Edit Which is being generated by using the default model. There are two user actions being performed. 1.When the user clicks the Create new -- he will be able to update only the Employee information 2.As and when the other section receives the employee family details they update the familyname and family date of birth. i.e, EmployeeFamilyname and EmployeFamilyDob. While creating new record the uses should be able to update employee information only and while editing the information he should be able to update the employeefamily information. Since I have a single view with most of these fields as required and not allowing null , How can I achieve this in a sincle view? I have recorrected the model like this public class Employee { public int Employee ID{get;set;} public string employeename{get;set;} public datetime employeeDOb{get;set;} public datetime? employeeDateOfJoin{get;set;} public string empFamilyname{get;set;} public datetime? empFamilyDob{get;set;} } Now by default I hope the createnew action would insert null value for empFamilyname(string datatype) and empFamilyDob . In the Edit action the user should be made to enter empFamilyname and empFamilyDob(mandatory). As there is every chance that the user might edit other information about the employee(like employeeDob) I don't want to go for partial views. Can you help me out with some illustration. Thanks in advance

    Read the article

  • Add an Opera Style Status Bar to Firefox

    - by Asian Angel
    Anyone who has used Opera will be familiar with the information presented for the webpage that is currently loading in the browser (i.e. number of images loaded). If you would like to have that same functionality in Firefox then join us as we look at the Extended Statusbar extension. Before Here is the default setup for Firefox…not a lot of information available to indicate exactly how much of the webpage has already loaded versus what has not. For some people this is enough but what if you like more details? Extended Statusbar in Action You may be curious about the information that the Extended Statusbar extension will provide. The information includes: Percentage of the webpage loaded The number of images loaded Bytes downloaded Average download speed The load time After emptying the cache we once again reloaded the HTG homepage. The default style/mode is “Classic Style” and the “webpage load information” will be displayed within your “Status Bar” as shown here. The information available after the webpage finished loading in “Classic Style”. If you prefer “Slim Mode” this is how your “Status Bar” should look afterwards…very condensed. For those preferring the “New Style” a temporary addition will appear above your regular “Status Bar” and disappear just a few seconds after the webpage has fully loaded (unless changed in the “Settings”). Settings The “Settings” are set up in two different ways. For those who prefer to use the “Classic Style & Slim Mode” these are the options available to you. If you prefer the “New Style” then you will have a whole different set of options available. Notice that you can exclude certain webpages and set a custom style if desired. Conclusion If you have been wanting to add Opera style webpage loading information to your “Status Bar” then you should definitely give this extension a try. Links Download the Extended Statusbar extension (Mozilla Add-ons)

    Read the article

  • JCP EC Nomination Materials for 2012

    - by heathervc
    The nomination period of the 2012 Annual JCP EC Elections will begin at the end of September 2012.  The JCP will be accept self-nominations for 2 seats on what will become the merged JCP EC, starting 28 September, with the nomination period ending on Thursday, 11 October. JCP Members (JSPA 2 primary contacts) will receive messages with instructions for nominating and their login credentials via email.  You will need this credential information to login and complete the nomination.The JCP EC Special Election schedule is posted online in the JCP calendar, highlights are below:Nominations for elected seats: 28 September-11 OctoberBallot (ratified and elected): 16-29 OctoberNew members take office: 13 November The ballot with nominees for ratified and nominated seats begins on 16 October. The results will also be available on jcp.org on 30 October. If you are attending JavaOne 2012 in San Francisco, there are several events happening that you may be interested in attending, in particular the following BOF session.Meet the JCP Executive Committee CandidatesSession ID: BOF6307Location: Hilton San Francisco - Golden Gate 3/4/5Date and Time: 10/2/12, 4:30 PM - 5:15 PM We will also be hosting a call for all of the candidates following the nomination period.  The following information is required for self-nomination.1) Contact information/Biography Each EC seat is represented by two people - a primary and alternate representative. Provide the following information for each representative: - Name - Title - Email Address - Mailing Address - Phone Number - Fax - A brief biography (3-5 sentences/~100-200 words) for primary contact - Photograph (prefer jpg format, head only shot) for primary contactBios and photos for the EC members are posted here:http://jcp.org/en/press/news/ec-feature_MEhttp://jcp.org/en/press/news/ec-feature_SE2) Qualification StatementA brief (2-3 paragraph) description of your qualifications for an EC seat; this is a Qualification Statement for the organization you represent. It should include the value and perspective you bring to the EC, your interests in the JCP program, as well as a summary of your current participation or planned participation in the JCP program (your entire organization)--JSRs, participation on Expert Groups, meetings/events attended, etc.  This statement will appear on the ballot and will convince community members that they should vote for you, so please include relevant information about your experience within the JCP program and your investments in Java technology.A few sample qualification statements are available here.3) Position PaperOne of the pieces of information we make available to the JCP membership for voting purposes is a position paper.  If you would like to provide this type of information for the ballot, please prepare in pdf format for posting.  This would be more detail on areas that you would put focus into during your tenure on the JCP EC.You can read more about some of the topics under discussion in the EC here, including links to JCP.Next materials. If you have an interest in participating in the JCP EC, please start preparing these materials now.  We look forward to a successful election process.

    Read the article

  • Upgrading 10.04LTS -> 10.10 using custom sources

    - by Boatzart
    I'm trying to upgrade to 10.10 from 10.04 LTS using a custom sources.list file that points to an unofficial mirror*. The mirror does have maverick, but I get the following output when upgrading: boatzart@somecomputer: > sudo do-release-upgrade Checking for a new ubuntu release Done Upgrade tool signature Done Upgrade tool Done downloading extracting 'maverick.tar.gz' authenticate 'maverick.tar.gz' against 'maverick.tar.gz.gpg' tar: Removing leading `/' from member names Reading cache Checking package manager Reading package lists... Done Building dependency tree Reading state information... Done Building data structures... Done Reading package lists... Done Building dependency tree Reading state information... Done Building data structures... Done Updating repository information WARNING: Failed to read mirror file No valid mirror found While scanning your repository information no mirror entry for the upgrade was found. This can happen if you run a internal mirror or if the mirror information is out of date. Do you want to rewrite your 'sources.list' file anyway? If you choose 'Yes' here it will update all 'lucid' to 'maverick' entries. If you select 'No' the upgrade will cancel. Continue [yN] y WARNING: Failed to read mirror file 96% [Working] Checking package manager Reading package lists... Done Building dependency tree Reading state information... Done Building data structures... Done Calculating the changes Calculating the changes Could not calculate the upgrade An unresolvable problem occurred while calculating the upgrade: The package 'update-manager-kde' is marked for removal but it is in the removal blacklist. This can be caused by: * Upgrading to a pre-release version of Ubuntu * Running the current pre-release version of Ubuntu * Unofficial software packages not provided by Ubuntu If none of this applies, then please report this bug against the 'update-manager' package and include the files in /var/log/dist-upgrade/ in the bug report. Restoring original system state Aborting Reading package lists... Done Building dependency tree Reading state information... Done Building data structures... Done Here is the relevant section from /var/log/dist-upgrade/main.log: 2010-11-18 14:05:52,117 DEBUG The package 'update-manager-kde' is marked for removal but it's in the removal blacklist 2010-11-18 14:05:52,136 ERROR Dist-upgrade failed: 'The package 'update-manager-kde' is marked for removal but it is in the removal blacklist.' 2010-11-18 14:05:52,136 DEBUG abort called *I'm located inside of USC, and for some crazy reason any sustained downloads to anywhere outside of the University are throttled down to 5kbps inside of my lab. Because of this I need to use the following sources.list: deb http://mirrors.usc.edu/pub/linux/distributions/ubuntu/ lucid main restricted universe multiverse deb http://mirrors.usc.edu/pub/linux/distributions/ubuntu/ lucid-updates main restricted universe multiverse deb http://mirrors.usc.edu/pub/linux/distributions/ubuntu/ lucid-backports main restricted universe multiverse deb http://mirrors.usc.edu/pub/linux/distributions/ubuntu/ lucid-security main restricted universe multiverse I've tried adding four more entries to the sources.list with s/lucid/maverick/ but that didn't help. Does anyone know how to fix this? Thanks!

    Read the article

  • WebCenter Customer Spotlight: Azul Brazilian Airlines

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter  Solution SummaryAzul Linhas Aéreas Brasileiras (Azul Brazilian Airlines) is the third-largest airline in Brazil serving  42 destinations with a fleet of 49 aircraft and employs 4,500 crew members. The company wanted to offer an innovative site with a simple purchasing process for customers to search for and buy tickets and for the company’s marketing team to more effectively conduct its campaigns. To this end, Azul implemented Oracle WebCenter Sites, succeeding in gathering all of the site’s key information onto a single platform. Azul can now complete the Web site content updating process—which used to take approximately 48 hours—in less than five minutes. Company OverviewAzul Linhas Aéreas Brasileiras (Azul Brazilian Airlines) has established itself as the third-largest airline in Brazil, based on a business model that combines low prices with a high level of service. Azul serves 42 destinations with a fleet of 49 aircraft. It operates 350 daily flights with a team of 4,500 crew members. Last year, the company transported 15 million passengers, achieving a 10% share of the Brazilian market, according to the Agência Nacional de Aviação Civil (ANAC, or the National Civil Aviation Agency). Business ChallengesThe company wanted to offer an innovative site with a simple purchasing process for customers to search for and buy tickets and for the company’s marketing team to more effectively conduct its campaigns. Provide customers with an  innovative Web site with a simple process for purchasing flight tickets Bring dynamism to the Web site’s content updating process to provide autonomy to the airline’s strategic departments, such as marketing and product development Facilitate integration among the site’s different application providers, such as ticket availability and payment process, on which ticket sales depend Solution DeployedAzul worked with the  Oracle partner TQI to implement Oracle WebCenter Sites, succeeding in gathering all of the site’s key information onto a single platform. Previously, at least three servers and corporate information environments had directed data to the portal. The single Oracle-based platform now facilitates site updates, which are daily and constant. Business Results Gained development freedom in all processes—from implementation to content editing Gathered all of the Web site’s key information onto a single platform, facilitating its daily and constant updating, whereas the information was previously spread among at least three IT environments and had to go through a complex process to be made available online to customers Reduced time needed to update banners and other Web site content from an average of 48 hours to less than five minutes Simplified the flight ticket sales process thanks to tool flexibility that enabled the company to improve Website usability “Oracle WebCenter Sites provides an easy-to-use platform that enables our marketing department to spend less time updating content and more time on innovative activities. Previously, it would take 48 hours to update content on our Web site; now it takes less than five minutes. We have shown the market that we are innovators, enabling customer convenience through an improved flight ticket purchase process.” Kleber Linhares, Information Technology and E-Commerce Director, Azul Linhas Aéreas Brasileiras Additional Information Azul Brazilian Airlines Case Study Oracle WebCenter Sites Oracle WebCenter Sites Satellite Server

    Read the article

  • WebCenter Customer Spotlight: Alberta Agriculture and Rural Development

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter  Solution SummaryAlberta Agriculture and Rural Development is a government ministry that works with producers and consumers to create a strong, competitive, and sustainable agriculture and food industry in the province of Alberta, Canada The primary business challenge faced by the Alberta Ministry of Agriculture was that of managing the rapid growth of their information.  They needed to incorporate a system that would work across 22 different divisions within the ministry and deliver an improved and more efficient experience for Desktop, Web and Mobile users, while addressing their regulatory compliance needs as part of the Canadian government. The customer implemented a centralized Enterprise Content Management solution based on Oracle WebCenter Content and developed a strong and repeatable information life cycle management methodology across all their 22 divisions and agencies. With the implemented solution, Alberta Agriculture and Rural Development  centrally manages over 20 million documents for 22 divisions and agencies and they have improved time required to find records,  reliability of information, improved speed and accuracy of reporting and data security. Company OverviewAlberta Agriculture and Rural Development is a government ministry that works with producers and consumers to create a strong, competitive, and sustainable agriculture and food industry in the province of Alberta, Canada.  Business ChallengesThe business users were overwhelmed by growth in documents (over 20 million files across 22 divisions and agencies) and it was difficult to find and manage documents and versions. There was a strong need for a personalized easy-to-use, secure and dependable method of managing and consuming content via desktop, Web, and mobile, while improving efficiency and maintaining regulatory compliance by removing the risk of non-uniform approaches to retention and disposition. Solution DeployedAs a first step Alberta Agriculture and Rural Development developed a business case with clear defined business drivers: Reduce time required to find records Locate “lost” records Capture knowledge lost through attrition Increase the ease of retrieval Reduce personal copies Increase reliability of information Improve speed and accuracy of reporting Improve data security The customer implemented a centralized Enterprise Content Management solution based on Oracle WebCenter Content. They used an incremental implementation approach aligned with their divisional and agency structure which allowed continuous process improvement. This led to a very strong and repeatable information life cycle management methodology across all their 22 divisions and agencies. Business ResultsAlberta Agriculture and Rural Development achieved impressive business results: Centrally managing over 20 million files for 22 divisions and agencies Federated model to manage documents in SharePoint and other applications Doing records management for both paper and electronic records Reduced time required to find records Increased the ease of retrieval Increased reliability of information Improved speed and accuracy of reporting Improved data security Additional Information Oracle Open World 2012 Presentation Oracle WebCenter Content

    Read the article

  • JDK 8u20 Documentation Updates

    - by joni g.
    JDK 8u20 has been released and is available from the Java Downloads page. See the JDK 8u20 Update Release Notes for details. Highlights for this release: The Medium security level has been removed. Now only High and Very High levels are available. Applets that do not conform with the latest security practices can still be authorized to run by adding the sites that host them to the Exception Site List. See Security for more information. The javafxpackager tool has been renamed to javapackager, and supports both Java and JavaFX applications. The -B option has been added to the javapackager deploy command to enable arguments to be passed to the bundlers that are used to create self-contained applications. See javapackager for Windows or Linux and OS X for information. The <fx:bundleArgument> helper parameter argument has been added to enable arguments to be passed to the bundlers when using ant tasks. See JavaFX Ant Task Reference for more information. A new attribute is available for JAR file manifests. The Entry-Point attribute is used to identify the classes that are allowed to be used as entry points to your application. See Entry-Point Attribute for more information. A new Microsoft Windows Installer (MSI) Enterprise JRE Installer, which enables users to install the JRE across the enterprise, is available for Java SE Advanced or Java SE Suite licensees. See Downloading the Installer in JRE Installation For Microsoft Windows for more information. The following new configuration parameters are added to the installation process to support commercial features, for use by Java SE Advanced or Java SE Suite licensees only: USAGETRACKERCFG= DEPLOYMENT_RULE_SET= See Installing With a Configuration File for more information about these and other installer parameters. Documentation highlights: New Troubleshooting Guide combines and replaces the Desktop Technologies Troubleshooting Guide and the HotSpot Virtual Machine Troubleshooting Guide to provide a single location for diagnosing and solving problems that might occur with Java Client applications. New Deployment Guide combines and replaces the JavaFX Deployment Guide and the Java Rich Internet Applications Guide to provide a single location for information about the Java packaging tools, creating self-contained applications, and deploying Java and JavaFX applications. New Garbage Collection Tuning Guide describes the garbage collectors included with the Java HotSpot VM and helps you choose which one to use. The Java Tutorials have a new look.

    Read the article

  • Logging connection strings

    If you use some of the dynamic features of SSIS, such as package configurations or property expressions, then trying to work out where your connections are pointing can be a bit confusing. You will work it out in the end, but it can be useful to explicitly log this information so that when things go wrong you can just review the logs. You may wish to develop this idea further and encapsulate such logging into a custom task, but for now let's keep it simple and use the Script Task. The Script Task code below will raise an Information event showing the name and connection string for a connection.

    Imports System
    Imports Microsoft.SqlServer.Dts.Runtime

    Public Class ScriptMain
        Public Sub Main()
            Dim fireAgain As Boolean
            ' Get the connection string; we need to know the name of the connection
            Dim connectionName As String = "My OLE-DB Connection"
            Dim connectionString As String = Dts.Connections(connectionName).ConnectionString
            ' Format the message and log it via an information event
            Dim message As String = String.Format("Connection ""{0}"" has a connection string of ""{1}"".", _
                connectionName, connectionString)
            Dts.Events.FireInformation(0, "Information", message, Nothing, 0, fireAgain)
            Dts.TaskResult = Dts.Results.Success
        End Sub
    End Class

    Building on that example, it is probably more flexible to log all connections in a package, as shown in the next example.

    Imports System
    Imports Microsoft.SqlServer.Dts.Runtime

    Public Class ScriptMain
        Public Sub Main()
            Dim fireAgain As Boolean
            ' Loop through all connections in the package
            For Each connection As ConnectionManager In Dts.Connections
                ' Get the connection string and log it via an information event
                Dim message As String = String.Format("Connection ""{0}"" has a connection string of ""{1}"".", _
                    connection.Name, connection.ConnectionString)
                Dts.Events.FireInformation(0, "Information", message, Nothing, 0, fireAgain)
            Next
            Dts.TaskResult = Dts.Results.Success
        End Sub
    End Class

    By using the Information event, the output is readily available in the designer, for example in the Visual Studio Output window (Ctrl+Alt+O) or the package designer Execution Results tab, and it also allows you to readily control the logging by choosing which events to log in the normal way. Now, before somebody starts commenting that this is a security risk, I would like to highlight good practice for building connection managers. Firstly, the Password property, or any other similarly sensitive property, is always defined as write-only, and secondly, the connection string property only uses the public properties to assemble the connection string value when requested. In other words, the connection string will never contain the password. I have seen a couple of cases where this is not true, but that was just bad development by third parties; you won't find anything like that in the box from Microsoft.

    Whilst writing this code it made me wish that there was a custom log entry that you could just turn on to do this for you, but alas connection managers do not even seem to support custom events. It did, however, remind me of a very useful event that is often overlooked and fits rather well alongside connection string logging: the Execute SQL Task's custom ExecuteSQLExecutingQuery event. To quote the help reference:
    Custom Messages for Logging - Provides information about the execution phases of the SQL statement. Log entries are written when the task acquires connection to the database, when the task starts to prepare the SQL statement, and after the execution of the SQL statement is completed. The log entry for the prepare phase includes the SQL statement that the task uses.
    It is the last part that is so useful: how often have you used an expression to derive a SQL statement and wanted to log it to make sure the correct SQL is being returned? You need to turn it on; by default no custom log events are captured, but I'll refer you to a walkthrough on setting up the logging for ExecuteSQLExecutingQuery by Jamie.
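    As a side note, if you also send these Information events to the SQL Server log provider, you can review them after the fact with a simple query. The following is only a sketch: it assumes the default dbo.sysssislog table used by the SQL Server log provider in SQL Server 2008 and later (SQL Server 2005 used dbo.sysdtslog90), and that the OnInformation event has been selected for logging.

    -- Review logged connection string messages from the SSIS log table
    -- Assumes the SQL Server log provider writes to the default dbo.sysssislog table
    SELECT  starttime,
            source,   -- the task or package that raised the event
            message   -- the formatted connection string message
    FROM    dbo.sysssislog
    WHERE   event = 'OnInformation'
        AND message LIKE 'Connection %has a connection string of%'
    ORDER BY starttime DESC;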

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 22 (sys.dm_db_index_physical_stats)

    - by Tamarick Hill
    The sys.dm_db_index_physical_stats Dynamic Management Function is used to return information about the fragmentation levels, page counts, depth, number of levels, record counts, etc. of the indexes on your database instance. One row is returned for each level in a given index, which we will discuss more later. The function takes a total of 5 input parameters, which are (1) database_id, (2) object_id, (3) index_id, (4) partition_number, and (5) the mode of the scan level that you would like to run. Let's use this function with our AdventureWorks2012 database to better illustrate the information it provides.

    SELECT * FROM sys.dm_db_index_physical_stats(db_id('AdventureWorks2012'), NULL, NULL, NULL, NULL)

    As you can see from the result set, there is a lot of beneficial information returned from this DMF. The first couple of columns in the result set (database_id, object_id, index_id, partition_number, index_type_desc, alloc_unit_type_desc) are either self-explanatory or have been explained in our previous blog sessions, so I will not go into detail about these at this time. The next column in the result set is index_depth, which represents how deep the index goes. For example, if we have a large index that contains 1 root page, 3 intermediate levels, and 1 leaf level, our index depth would be 5. The next column is index_level, which refers to what level (of the depth) a particular row is referring to. Next is probably one of the most beneficial columns in this result set, avg_fragmentation_in_percent. This column shows you how fragmented a particular level of an index may be. Many people use this column within their index maintenance jobs to dynamically determine whether they should do REORGs or full REBUILDs of a given index (a sketch of that approach follows below). The fragment_count column represents the number of fragments in a leaf level, while avg_fragment_size_in_pages represents the number of pages in a fragment. The page_count column tells you how many pages are in a particular index level. From my result set above, you see that the remaining columns all have NULL values. This is because I did not specify a 'mode' in my query, and as a result it used the 'LIMITED' mode by default. The LIMITED mode is meant to be lightweight, so it does not collect information for every column in the result set. I will re-run my query using the 'DETAILED' mode, and you will see we now have results for these rows.

    SELECT * FROM sys.dm_db_index_physical_stats(db_id('AdventureWorks2012'), NULL, NULL, NULL, 'DETAILED')

    From the remaining columns, you see we get even more detailed information, such as how many records are in a particular index level (record_count). We have a column for ghost_record_count, which represents the number of records that have been marked for deletion but have not physically been removed by the background ghost cleanup process. We later see information on the MIN, MAX, and AVG record size in bytes. The forwarded_record_count column refers to records that have been updated and no longer fit within the row on the page and thus have to be moved; a forwarded record is left in the original location with a pointer to the new location. The last column in the result set is the compressed_page_count column, which tells you how many pages in your index have been compressed. This is a very powerful DMF that returns good information about the current indexes in your system. However, based on the mode you select, it could be a very resource-intensive function, so be careful with how you use it.
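    As a minimal sketch of that maintenance-decision pattern, the query below flags each index level with a suggested action based on commonly cited guideline thresholds (roughly 5 to 30 percent fragmentation for a REORGANIZE, above 30 percent for a REBUILD). The thresholds and the page_count filter are illustrative assumptions you would tune for your own environment, not values taken from the article.

    -- Suggest an action per index level based on fragmentation (illustrative thresholds only)
    SELECT  OBJECT_NAME(ips.object_id)            AS table_name,
            i.name                                AS index_name,
            ips.index_level,
            ips.avg_fragmentation_in_percent,
            ips.page_count,
            CASE
                WHEN ips.avg_fragmentation_in_percent > 30 THEN 'REBUILD'
                WHEN ips.avg_fragmentation_in_percent > 5  THEN 'REORGANIZE'
                ELSE 'NONE'
            END                                   AS suggested_action
    FROM    sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN    sys.indexes AS i
            ON  i.object_id = ips.object_id
            AND i.index_id  = ips.index_id
    WHERE   ips.page_count > 100   -- skip very small indexes where fragmentation rarely matters
    ORDER BY ips.avg_fragmentation_in_percent DESC;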
For more information on this Dynamic Management Function, please see the below Books Online link: http://msdn.microsoft.com/en-us/library/ms188917.aspx Follow me on Twitter @PrimeTimeDBA

    Read the article

< Previous Page | 75 76 77 78 79 80 81 82 83 84 85 86  | Next Page >