Search Results

Search found 15926 results on 638 pages for 'business card'.

Page 74/638 | < Previous Page | 70 71 72 73 74 75 76 77 78 79 80 81  | Next Page >

  • IBM x260m won't boot from PCI SATA card

    - by syrenity
    Hi. We ghosted our old Windows 2003 drive to a new 1TB one and connected it through a new PCI SATA card. The x260m server fails to boot with error 1962: "Boot sector not found." When we connect the drive directly to the motherboard it boots fine, so the problem is probably getting the BIOS to boot from the SATA card. Has anyone encountered and solved such an issue? EDIT: We tried disabling the on-board SATA controller (HostRAID/SAS?), but there is no such option - only Enable, Enhanced, and Compatible. We also tried changing the boot device priority to all possible choices - no luck. Thanks in advance!

    Read the article

  • Microsoft Ergonomic Keyboards With Card Readers?

    - by Steve
    When I started working at my current job I developed tendinitis in my wrists. Luckily that cleared up when I started using a Microsoft ergonomic keyboard. The problem is that my workplace is moving to stronger security: we will need to insert a card into a slot to log into our PCs. They bought a bunch of new keyboards with these slots built in - all regular keyboards. Is there something like the Microsoft ergonomic keyboard that comes with such a card slot? Thanks.

    Read the article

  • Apple Airport Extreme Wireless Card M8881 Connectivity Issue

    - by Carlosfocker
    I bought an Apple AirPort Extreme wireless card (M8881) with the antenna for my old Power Mac G5 1.6 GHz running OS X 10.4. The card works and I can connect to my D-Link 802.11g wireless router using WPA2/AES encryption. The problem is that the connection drops intermittently: when the issue occurs, the signal indicator falls to one or two bars and the performance of the connection suffers greatly. The computer isn't far from the wireless router, and a laptop at the same desk has no issues. I'm not sure what the cause might be - updated drivers? Firmware? Let me know if I forgot any important information.

    Read the article

  • Macbook Pro (SantaRosa) internal display not detected by graphics card, external monitor OK

    - by BLAU
    My MacBook Pro (2.4 GHz, Santa Rosa with the infamous NVIDIA card) is acting strangely. It shows the normal gray screen with the Apple logo and spinner flawlessly during startup, but the internal display goes black, with no rendering at all, once everything is loaded (shining a light on the display shows nothing). If an external monitor is connected through the DVI port, it remains black during startup and then shows the desktop while the internal display stays black. This happens both when booting Mountain Lion and Windows XP. I have checked "About This Mac" and only the external display is listed; the same is the case in the NVIDIA control panel in Windows XP. My questions: Is this a hardware problem, or is it related to software or maybe even firmware? What controls the display during startup - the graphics card or something else?

    Read the article

  • External video card for notebook?

    - by Fuxi
    Hi, I'd like to hook an external monitor to my notebook, as I need a higher resolution (1680x1050). My notebook supports that resolution (via its analog VGA out), but the quality is pretty awful: there are blurry horizontal shadows everywhere, and tweaking didn't help - it seems the VGA output's quality just isn't good enough. So I was wondering: does it make sense to look for an external video card? Can I connect a DVI monitor via a USB video adapter with optimal quality? My notebook also has a PCMCIA slot. Any recommendations on what I should buy? Thanks.

    Read the article

  • Best use of new express card on Windows

    - by jckdnk111
    I just bought a 48GB SSD ExpressCard for my laptop and I am trying to decide how best to use it. I will be running some sort of virtualization (probably VirtualBox) to test and learn Windows Server administration. I am running Windows 7 Ultimate 64-bit with 4GB of RAM and a 7200 RPM SATA hard disk. The ExpressCard reads at 115MB/s and writes at 65MB/s. So how best to use this new disk: ReadyBoost, relocating the pagefile, storing VM disks, or some mix of these?

    Read the article

  • Share the DVB card on Windows 7 [closed]

    - by Bashar Kernel
    I have two computers connected to a router, and one of them has a DVB card. I want to use that one DVB card to feed both machines. From what I've read, I should share the DVB adapter over the LAN with Internet Connection Sharing, but when I enable connection sharing I lose my internet access. I tried "Bridge Connections" as well, and lost internet access that way too. Can anyone tell me how to fix this problem, and how to view the channels on the second machine (for example, with VLC)?
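
    A sketch of one possible workaround, under assumptions the post doesn't confirm (a DVB-T tuner, VLC installed on both machines, the tuner PC reachable at 192.168.1.10, and the frequency below being a placeholder for your local multiplex): rather than fighting Internet Connection Sharing, let VLC tune the card on the first PC and restream it over HTTP on the LAN.

    # On the PC with the DVB card: tune and restream as an MPEG-TS over HTTP on port 8080
    $ vlc dvb-t://frequency=538000000 --sout "#standard{access=http,mux=ts,dst=:8080}"

    # On the second PC: open the network stream
    $ vlc http://192.168.1.10:8080

    Both commands work the same from a Windows command prompt (with vlc on the PATH); changing channels means restarting the stream with a different frequency or program.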

    Read the article

  • Issues with new graphics card and drivers

    - by Ortund
    With my onboard graphics (Intel HD 4000) I was able to use a 1680x1050 screen resolution. Using the same screen with an Asus ENGTS250 graphics card and the 296.10 drivers installed, I'm no longer able to use that resolution. I've also had other trouble with driver installations: with 340.52 or 337.88 I can't boot into Windows 7 - I see the Windows welcome screen, then the screen goes black, loses signal, and the computer locks up. I suspect the driver issues and the resolution problem go hand in hand. I have a feeling a newer driver than 296.10 would give me access to the 1680x1050 resolution I want, but I'm terrified that if I update or install another driver the computer will lock up again. Can anyone recommend a [better] driver for me to use?

    Read the article

  • Is Ubuntu recognizing and/or using my NVIDIA graphics card?

    - by user212860
    This is my first post here, and I'm pretty new to Ubuntu/Linux. I currently have no OS other than Ubuntu 13.10 (I had Win7 until I got a new terabyte hard drive). My current PC build, if any of this helps: CPU: Intel i5 quad-core; Graphics: NVIDIA GeForce GTX 650; RAM: 8 GB; HDD: 1 TB SATA 3; Motherboard: MSI Z77A-G41; OS: Ubuntu 13.10. So I recently installed Ubuntu 13.10 and put Steam on it, and I'm seeing that my games run a lot slower than they did on Win7. I figured it was a graphics problem, so I checked System Settings > Details > Overview. It says under "Graphics" that I have "Gallium 0.4 on NVE7" (I don't really know what that is). Does this mean that Ubuntu is not using my graphics card? In System Settings > Software & Updates > Additional Drivers, it clearly shows: NVIDIA Corporation: GK107 [GeForce GTX 650] - This device is using an alternative driver (and then it shows a list of drivers that I can switch between). So this is a bit confusing: Software & Updates clearly shows that my NVIDIA card is installed and that I have a driver selected for it, but System Settings shows some "Gallium 0.4" thing. I did a bit of research and ran "lspci | grep VGA" in the Terminal, which returned: VGA compatible controller: NVIDIA Corporation GK107 [GeForce GTX 650] (rev a1). The Terminal seems to recognize my graphics card. What it looks like to me is that I don't have the proper driver and might be using my CPU's integrated graphics. When I switch between the drivers in that list, System Settings still does not see my card, and some of the drivers give me an OpenGL error when I try to run a game. It might just be that my games run slowly because the developers have not optimized them well for Ubuntu, but that still doesn't change the fact that System Settings is not showing my NVIDIA card. TL;DR version: How do I know if my video card is being recognized/used? If it is not being used, what is the best way to fix that? Please keep your answers easy to understand; I don't mind wordy responses as long as I can follow what you're saying. Any help would be greatly appreciated! Thanks, Jabber5
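
    "Gallium 0.4 on NVE7" is the open-source Nouveau/Mesa stack's name for a GK107 (Kepler) GPU, so the Details panel is reporting that the Nouveau driver, not NVIDIA's proprietary one, is doing the rendering. A minimal way to confirm from a terminal, sketched under the assumption of stock Ubuntu 13.10 packages (glxinfo ships in mesa-utils, and the exact nvidia-* package name varies by release):

    $ lspci -k | grep -A 3 VGA          # "Kernel driver in use:" will say nouveau or nvidia
    $ sudo apt-get install mesa-utils   # provides glxinfo if it is missing
    $ glxinfo | grep "OpenGL renderer"  # names the driver actually rendering OpenGL

    If the renderer still reports Gallium/NVE7 after picking the proprietary driver in Additional Drivers, rebooting and reselecting it, or installing a specific package (e.g. sudo apt-get install nvidia-319, one of the versions offered for 13.10-era releases), is a reasonable next step.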

    Read the article

  • Rapidly Deploy Oracle Applications with Oracle VM Templates

    - by monica.kumar
    Oracle today announced Oracle VM Templates for a number of Oracle Applications, including: Oracle E-Business Suite 12.1, Oracle's JD Edwards EnterpriseOne 9.0, and Oracle's PeopleSoft 9.1. These Oracle VM Templates, based on Oracle Enterprise Linux, provide pre-installed and pre-configured enterprise software images that help eliminate the need to install new software from scratch, offering customers a time-saving approach to deploying a fully configured software stack. Learn more about Oracle VM Templates.

    Read the article

  • No HDMI Audio with GeForce 9600GT and nForce board

    - by Bobby
    I've been trying to get HDMI with sound working for the last few days, and I'm a little bit out of ideas. (I've verified that the hardware/setup works via Windows.) aplay does not list my HDMI device:

    $ aplay -l
    **** List of PLAYBACK Hardware Devices ****
    card 0: NVidia [HDA NVidia], device 0: ALC662 rev1 Analog [ALC662 rev1 Analog]
      Subdevices: 1/1
      Subdevice #0: subdevice #0
    card 0: NVidia [HDA NVidia], device 1: ALC662 rev1 Digital [ALC662 rev1 Digital]
      Subdevices: 1/1
      Subdevice #0: subdevice #0

    I've already compiled the ALSA drivers (1.0.24) from a snapshot (with --with-oss=no) and added the line

    options snd-hda-intel model=auto    # Tried 3stack-dig and 6stack-dig too

    to /etc/modprobe.d/alsa-base.conf. Still, the device does not show up. If it is important, the HDMI TV is at the moment not configured to be part of the X session (I've tried that too, at least with an X restart, and it didn't change anything). What did I miss?

    $ lspci
    00:00.0 Host bridge: nVidia Corporation Device 07c3 (rev a2)
    00:00.1 RAM memory: nVidia Corporation nForce 630i memory controller (rev a2)
    00:01.0 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:01.1 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:01.2 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:01.3 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:01.4 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:01.5 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:01.6 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:02.0 RAM memory: nVidia Corporation nForce 630i memory controller (rev a1)
    00:03.0 ISA bridge: nVidia Corporation MCP73 LPC Bridge (rev a2)
    00:03.1 SMBus: nVidia Corporation MCP73 SMBus (rev a1)
    00:03.2 RAM memory: nVidia Corporation MCP73 Memory Controller (rev a1)
    00:03.4 RAM memory: nVidia Corporation MCP73 Memory Controller (rev a1)
    00:04.0 USB Controller: nVidia Corporation GeForce 7100/nForce 630i USB (rev a1)
    00:04.1 USB Controller: nVidia Corporation MCP73 [nForce 630i] USB 2.0 Controller (EHCI) (rev a1)
    00:08.0 IDE interface: nVidia Corporation MCP73 IDE (rev a1)
    00:09.0 Audio device: nVidia Corporation MCP73 High Definition Audio (rev a1)
    00:0a.0 PCI bridge: nVidia Corporation MCP73 PCI Express bridge (rev a1)
    00:0b.0 PCI bridge: nVidia Corporation MCP73 PCI Express bridge (rev a1)
    00:0c.0 PCI bridge: nVidia Corporation MCP73 PCI Express bridge (rev a1)
    00:0d.0 PCI bridge: nVidia Corporation MCP73 PCI Express bridge (rev a1)
    00:0e.0 IDE interface: nVidia Corporation MCP73 IDE (rev a2)
    00:0f.0 Ethernet controller: nVidia Corporation MCP73 Ethernet (rev a2)
    02:00.0 VGA compatible controller: nVidia Corporation G94 [GeForce 9600 GT] (rev a1)

    $ aplay -L
    default
    pulse
        Playback/recording through the PulseAudio sound server
    front:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        Front speakers
    surround40:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        4.0 Surround output to Front and Rear speakers
    surround41:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        4.1 Surround output to Front, Rear and Subwoofer speakers
    surround50:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        5.0 Surround output to Front, Center and Rear speakers
    surround51:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        5.1 Surround output to Front, Center, Rear and Subwoofer speakers
    surround71:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        7.1 Surround output to Front, Center, Side, Rear and Woofer speakers
    iec958:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Digital
        IEC958 (S/PDIF) Digital Audio Output
    dmix:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        Direct sample mixing device
    dmix:CARD=NVidia,DEV=1
        HDA NVidia, ALC662 rev1 Digital
        Direct sample mixing device
    dsnoop:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        Direct sample snooping device
    dsnoop:CARD=NVidia,DEV=1
        HDA NVidia, ALC662 rev1 Digital
        Direct sample snooping device
    hw:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        Direct hardware device without any conversions
    hw:CARD=NVidia,DEV=1
        HDA NVidia, ALC662 rev1 Digital
        Direct hardware device without any conversions
    plughw:CARD=NVidia,DEV=0
        HDA NVidia, ALC662 rev1 Analog
        Hardware device with all software conversions
    plughw:CARD=NVidia,DEV=1
        HDA NVidia, ALC662 rev1 Digital
        Hardware device with all software conversions
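
    Given the aplay -L listing above, the codec does expose an IEC958 (S/PDIF) digital output, and on GeForce 9600 GT-era cards HDMI audio is typically the motherboard's S/PDIF signal passed through to the card rather than a separate HDMI codec, so a dedicated HDMI device may never appear. A quick sanity check, assuming alsa-utils is installed and using the device names exactly as listed above, is to unmute the digital output and send a test tone through it:

    $ alsamixer -c 0    # unmute/enable the IEC958 (S/PDIF) switch if one is shown
    $ speaker-test -D iec958:CARD=NVidia,DEV=0 -c 2 -t wav
    $ aplay -D iec958:CARD=NVidia,DEV=0 /usr/share/sounds/alsa/Front_Center.wav

    If the TV produces sound this way, the problem is mixer/routing configuration rather than a missing HDMI device; if not, checking whether the card's S/PDIF header is actually cabled to the motherboard would be the next thing to rule out.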

    Read the article

  • Getting It Right The First Time

    - by andyleonard
    Introduction
    This post is the seventeenth part of a ramble-rant about the software business. The current posts in this series are: Goodwill, Negative and Positive; Visions, Quests, Missions; Right, Wrong, and Style; Follow Me; Balance, Part 1; Balance, Part 2; Definition of a Great Team; The 15-Minute Meeting; Metaproblems: Drama; The Right Question; Software is Organic, Part 1; Metaproblem: Terror; I Don't Work On My Car; A Turning Point; Human Doings; Everything Changes. This post is about getting software right...(read more)

    Read the article

  • SQL SERVER – Guest Post – Architecting Data Warehouse – Niraj Bhatt

    - by pinaldave
    Niraj Bhatt works as an Enterprise Architect for a Fortune 500 company and has an innate passion for building and studying software systems. He is a top-rated speaker at various technical forums including Tech·Ed, MCT Summit, Developer Summit, and Virtual Tech Days, among others. Having run a successful startup for four years, Niraj enjoys working on IT innovations that can impact an enterprise bottom line, streamlining IT budgets through IT consolidation, architecture and integration of systems, performance tuning, and review of enterprise applications. He has received the Microsoft MVP award for ASP.NET, Connected Systems, and most recently Windows Azure. When he is away from his laptop, you will find him taking deep dives in automobiles, pottery, rafting, photography, cooking, and financial statements, though not necessarily in that order. He is also a manager/speaker at BDOTNET, Asia's largest .NET user group. Here is the guest post by Niraj Bhatt.

    As the data in your applications grows, it's usually the database that becomes the bottleneck. It's hard to scale a relational DB, so the preferred approach for large-scale applications is to create separate databases for writes and reads, referred to as the transactional database and the reporting database. Though there are tools and techniques that can create a snapshot of your transactional database for reporting purposes, sometimes they don't quite fit the reporting requirements of an enterprise. These requirements typically are data analytics, an effective schema (one an information worker can use to self-service), historical data, and better performance (flat data, no joins). This is where the need for a data warehouse, or an OLAP system, arises.

    A key point to remember is that a data warehouse is mostly a relational database. It's built on top of the same concepts: tables, rows, columns, primary keys, foreign keys, etc. Before we talk about how data warehouses are typically structured, let's understand the key components that create a data flow between OLTP systems and OLAP systems. There are three major areas:

    a) The OLTP system should be capable of tracking its changes, as all these changes should go back to the data warehouse for historical recording. For example, if an OLTP transaction moves a customer from the silver to the gold category, the OLTP system needs to ensure that this change is tracked and sent to the data warehouse for reporting purposes. A report in this context could be how many customers, divided by geography, moved from the silver to the gold category. In data warehouse terminology this process is called Change Data Capture. Quite a few systems leverage database triggers to move these changes to corresponding tracking tables; some databases also offer out-of-the-box features, e.g. SQL Server 2008 provides Change Data Capture and Change Tracking for addressing such requirements.

    b) After we make the OLTP system capable of tracking its changes, we need to provision a batch process that runs periodically, takes these changes from the OLTP system, and loads them into the data warehouse. There are many tools out there that can fill this gap - SQL Server Integration Services happens to be one of them.

    c) So we have an OLTP system that knows how to track its changes, and we have jobs that run periodically to move these changes to the warehouse. The question that remains is how the warehouse will record these changes. This structural aspect of the data warehouse is often covered under the name Slowly Changing Dimensions (SCD). While we will talk about dimensions in a while, SCD can be applied to pure relational tables too. SCD enables a database structure to capture historical data. This creates multiple records for a given entity in the relational database, which is why data warehouses prefer having their own primary key, often known as a surrogate key.

    As I mentioned, a data warehouse is just a relational database, but the industry often attributes a specific schema style to data warehouses: Star schema or Snowflake schema. The motivation behind these styles is to create a flat database structure (as opposed to a normalized one) that is easy to understand and use, easy to query, and easy to slice and dice. A star schema is a database structure made up of dimensions and facts. Facts are generally the numbers (sales, quantity, etc.) that you want to slice and dice. Fact tables hold these numbers and have references (foreign keys) to a set of tables that provide context around them. E.g. if you have recorded 10,000 USD in sales, that number would go in a sales fact table, with foreign keys referring to the sales agent responsible for the sale and to a time table containing the dates between which the sale was made. These agent and time tables are called dimensions, and they provide context for the numbers stored in fact tables. This structure of the fact at the center surrounded by dimensions is called a star schema; a similar structure in which the dimension tables are normalized is called a snowflake schema.

    This relational structure of facts and dimensions serves as the input for another analysis structure called a cube. Though physically a cube is a special structure supported by commercial databases like SQL Server Analysis Services, logically it's a multidimensional structure where dimensions define the sides of the cube and facts define the content. Facts are often called measures inside a cube. Dimensions often form a hierarchy: e.g. Product may be broken into categories, and categories in turn into individual items. Category and Item are referred to as levels, their constituents as members, and the overall structure as a hierarchy. Measures are rolled up along the dimensional hierarchy, and these rolled-up measures are called aggregates. This may seem like an overwhelming vocabulary to deal with, but don't worry - it will sink in as you start working with cubes.

    Let's see a few other terms we run into while talking about data warehouses. ODS, or Operational Data Store, is a frequently misused term. There will be a few users in your organization who want to report on the most current data and can't afford to miss a single transaction in their report. Then there is another set of users, mostly senior-level executives interested in trending, mining, forecasting, and strategizing, who don't care about one specific transaction. This is where an ODS can come in handy. An ODS can use the same star schema and the OLAP cubes we saw earlier; the only difference is that the data inside an ODS is short-lived, i.e. kept for a few months, and the ODS syncs with the OLTP system every few minutes. The data warehouse can periodically sync with the ODS, either daily or weekly, depending on business drivers.

    Data marts are another frequently discussed topic in data warehousing. They are subject-specific data warehouses. Data warehouses that try to span an entire enterprise are normally too big to scope, build, manage, and track, so they are often scaled down to data marts that support a specific segment of business such as sales, marketing, or support. Data marts, too, are often designed using the star schema model discussed earlier. The industry is divided when it comes to the use of data marts: some experts prefer having data marts along with a central data warehouse, where the warehouse acts as an information staging and distribution hub with data marts as spokes, connected via data feeds serving summarized data; others eliminate the need for a centralized data warehouse, citing that most users want to report on detailed data.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Read the article

  • Tellago announces SQL Server 2008 R2 BI quick adoption programs

    - by gsusx
    During the last year, we (Tellago) have been involved in various business intelligence initiatives that leverage some emerging BI techniques such as self-service BI or complex event processing (CEP). Specifically, in the last few months, we have partnered with Microsoft to deliver a series of events across the country where we present the different technologies of the SQL Server 2008 R2 BI stack such as PowerPivot, StreamInsight, Ad-Hoc Reporting and Master Data Services. As part of those events...(read more)

    Read the article

  • SAP unveils BusinessObjects 4.0: the new version of its BI solution adds mobility, social networks, and in-memory technology

    SAP unveils BusinessObjects 4.0: the new version of its BI solution adds mobility, social networks, and in-memory technology. SAP has just unveiled BusinessObjects 4.0, the next version of its next-generation Business Intelligence and Enterprise Information Management (EIM) platform. [Image: SAP BusinessObjects 4.0 Event Insight slide - http://ftp-developpez.com/gordon-fowler/SAP/Slide-5-SAP-BusinessObjects-4.0-Event-Insight2.jpg] After SAP ByDesign 2.6, its SaaS ERP suite (which arrives with a brand-new SDK), BusinessObjects 4.0 is the second very big announcement of 2011 that Nicolas Sekkaki, Direc...

    Read the article

  • HOUG 2010 conference presentations now available for download

    - by Fekete Zoltán
    The HOUG 2010 conference took place on March 22-24, 2010. The presentation materials can now be downloaded from http://www.houg.hu/ by clicking Archívum and then HOUG 2010. The photos taken at the conference have not been uploaded yet, but let's hope they will appear shortly. :) The presentations of the Business Intelligence & Data Warehouse track are available here. Happy browsing!

    Read the article

  • Oracle AIM, Oracle ABF, and Siebel Results Roadmap Officially Retired as of January 31, 2011

    - by tom.spitz
    It seems somehow appropriate that the first entry of the Oracle® Unified Method (OUM) blog is about the retirement of several of our legacy methods, most notably AIM Foundation. If you're reading this, you're probably aware that Oracle has been developing OUM to support the entire Enterprise IT lifecycle, including support for the successful implementation of every Oracle product. As Oracle has continued to acquire new companies and technologies, it has become essential that we also create a single, unified language and approach for implementation across the Oracle ecosystem. With the release of OUM 5.1 in 2009, OUM provided full support for all enterprise application implementation projects including Oracle E-Business Suite R12, Siebel CRM, PeopleSoft Enterprise, and JD Edwards EnterpriseOne projects. In 2010, we released OUM training that supports the use of OUM on these types of projects. That support represented a major milestone in the evolution of OUM and enabled implementers to transition to OUM. Consequently, we announced a staggered retirement schedule for Oracle's legacy methods. On January 31, 2011 we announced the retirement of: the Oracle Application Implementation Method (AIM), Oracle AIM for Business Flows (ABF), and the Siebel Results Roadmap. Later this year, we will announce the retirement of Compass - the legacy PeopleSoft method - and Data Warehouse Method Fast Track. OUM is available free of charge to Oracle Gold, Platinum, and Diamond partners through the Oracle Partner Network (OPN) [OUM on OPN]. The OUM Customer Program allows customers to obtain copies of the method for their internal use by contracting with Oracle for an engagement of two weeks or longer meeting some additional minimum criteria. There will be more retirement announcements in the coming months. For now it's "Adios AIM." Thanks for the memories...

    Read the article

  • My Name is Andy and I Have an Associates of Applied Science Degree

    - by andyleonard
    Introduction
    This post is the forty-first part of a ramble-rant about the software business. The current posts in this series can be found on the series landing page. This post is about the education requirements in job listings.
    Bachelor's Degree Required
    I'm not qualified for any job that requires a bachelor's degree because I don't have one. I regularly receive email from recruiters asking if I'm available and interested in a position they have open, or a position they've been asked to fill...(read more)

    Read the article

  • One-Time Boosts

    - by andyleonard
    Introduction
    This post is the eighteenth part of a ramble-rant about the software business. The current posts in this series are: Goodwill, Negative and Positive; Visions, Quests, Missions; Right, Wrong, and Style; Follow Me; Balance, Part 1; Balance, Part 2; Definition of a Great Team; The 15-Minute Meeting; Metaproblems: Drama; The Right Question; Software is Organic, Part 1; Metaproblem: Terror; I Don't Work On My Car; A Turning Point; Human Doings; Everything Changes; Getting It Right The First Time. This post...(read more)

    Read the article

  • Oracle BI administration and documentation

    - by Fekete Zoltán
    The question came up of how to install the administration tools for the Oracle Business Intelligence packages (BI EE, BI SE One). The BI end-user interface itself is web-based; through a browser we can use the integrated elements: interactive dashboards, ad hoc analyses, reports and summaries, alerts and notifications, guided analytics and processes, and more. Some of the administrator tools have to be installed as clients on Windows machines, i.e. they are available in the Windows edition of the BI EE installation kit. The Oracle BI documentation, including the administration guide, can be read and downloaded here.

    Read the article

  • How important is responsive web design?

    - by Daniel
    I've heard many different opinions recently regarding the pros and cons of responsive web design, and I was wondering whether it is necessary for small businesses that target small geographical areas to implement it. Some sub-questions I have relating to this include: Is it better to use responsive web design as opposed to separate code for different dimensions/devices? Can it affect SEO (positively or negatively)? What are the main problems I could run into when optimizing a website for a business using this design method?

    Read the article

  • How accurate is "Business logic should be in a service, not in a model"?

    - by Jeroen Vannevel
    Situation
    Earlier this evening I gave an answer to a question on StackOverflow. The question: Should editing of an existing object be done in the repository layer or in a service? For example, if I have a User that has debt and I want to change his debt, should I do it in UserRepository or in a service, for example BuyingService, by getting the object, editing it, and saving it?

    My answer: You should leave the responsibility of mutating an object to that same object and use the repository to retrieve this object. Example situation:

    class User {
        private int debt; // debt in cents
        private string name;

        // getters

        public void makePayment(int cents) {
            debt -= cents;
        }
    }

    class UserRepository {
        public User GetUserByName(string name) {
            // Get appropriate user from database
        }
    }

    A comment I received: "Business logic should really be in a service. Not in a model."

    What does the internet say?
    So, this got me searching, since I've never really (consciously) used a service layer. I started reading up on the Service Layer pattern and the Unit of Work pattern, but so far I can't say I'm convinced a service layer has to be used. Take for example this article by Martin Fowler on the anti-pattern of an Anemic Domain Model:

    "There are objects, many named after the nouns in the domain space, and these objects are connected with the rich relationships and structure that true domain models have. The catch comes when you look at the behavior, and you realize that there is hardly any behavior on these objects, making them little more than bags of getters and setters. Indeed often these models come with design rules that say that you are not to put any domain logic in the domain objects. Instead there are a set of service objects which capture all the domain logic. These services live on top of the domain model and use the domain model for data. (...) The logic that should be in a domain object is domain logic - validations, calculations, business rules - whatever you like to call it."

    To me, this seemed exactly what the situation was about: I advocated the manipulation of an object's data by introducing methods inside that class that do just that. However I realize that this should be a given either way, and it probably has more to do with how these methods are invoked (using a repository). I also had the feeling that in that article (see below), a Service Layer is considered more a façade that delegates work to the underlying model than an actual work-intensive layer:

    "Application Layer [his name for Service Layer]: Defines the jobs the software is supposed to do and directs the expressive domain objects to work out problems. The tasks this layer is responsible for are meaningful to the business or necessary for interaction with the application layers of other systems. This layer is kept thin. It does not contain business rules or knowledge, but only coordinates tasks and delegates work to collaborations of domain objects in the next layer down. It does not have state reflecting the business situation, but it can have state that reflects the progress of a task for the user or the program."

    Which is reinforced here:

    "Service interfaces. Services expose a service interface to which all inbound messages are sent. You can think of a service interface as a façade that exposes the business logic implemented in the application (typically, logic in the business layer) to potential consumers."

    And here:

    "The service layer should be devoid of any application or business logic and should focus primarily on a few concerns. It should wrap Business Layer calls, translate your Domain in a common language that your clients can understand, and handle the communication medium between server and requesting client."

    This is a serious contrast to other resources that talk about the Service Layer:

    "The service layer should consist of classes with methods that are units of work with actions that belong in the same transaction."

    Or the second answer to a question I've already linked:

    "At some point, your application will want some business logic. Also, you might want to validate the input to make sure that there isn't something evil or nonperforming being requested. This logic belongs in your service layer."

    "Solution"?
    Following the guidelines in this answer, I came up with the following approach that uses a Service Layer:

    class UserController : Controller {
        private UserService _userService;

        public UserController(UserService userService) {
            _userService = userService;
        }

        public ActionResult MakeHimPay(string username, int amount) {
            _userService.MakeHimPay(username, amount);
            return RedirectToAction("ShowUserOverview");
        }

        public ActionResult ShowUserOverview() {
            return View();
        }
    }

    class UserService {
        private IUserRepository _userRepository;

        public UserService(IUserRepository userRepository) {
            _userRepository = userRepository;
        }

        public void MakeHimPay(string username, int amount) {
            _userRepository.GetUserByName(username).makePayment(amount);
        }
    }

    class UserRepository {
        public User GetUserByName(string name) {
            // Get appropriate user from database
        }
    }

    class User {
        private int debt; // debt in cents
        private string name;

        // getters

        public void makePayment(int cents) {
            debt -= cents;
        }
    }

    Conclusion
    All together, not much has changed here: code from the controller has moved to the service layer (which is a good thing, so there is an upside to this approach). However, this doesn't look like it had anything to do with my original answer. I realize design patterns are guidelines, not rules set in stone to be implemented whenever possible. Yet I have not found a definitive explanation of the service layer and how it should be regarded. Is it a means to simply extract logic from the controller and put it inside a service instead? Is it supposed to form a contract between the controller and the domain? Should there be a layer between the domain and the service layer? And, last but not least, following the original comment "Business logic should really be in a service. Not in a model." - is this correct? How would I introduce my business logic in a service instead of the model?

    Read the article

  • T-SQL Tuesday: Personality Clashes, Style Collisions, and Differences of Opinion

    - by andyleonard
    This post is the twenty-sixth part of a ramble-rant about the software business. The current posts in this series are: Goodwill, Negative and Positive; Visions, Quests, Missions; Right, Wrong, and Style; Follow Me; Balance, Part 1; Balance, Part 2; Definition of a Great Team; The 15-Minute Meeting; Metaproblems: Drama; The Right Question; Software is Organic, Part 1; Metaproblem: Terror; I Don't Work On My Car; A Turning Point; Human Doings; Everything Changes; Getting It Right The First Time; One-Time Boosts; Institutionalized!...(read more)

    Read the article

  • Constraint-based expert systems design and development [on hold]

    - by Alex B.
    I would appreciate some recommendations and resources on the design and development of expert systems, in particular knowledge-based and constraint-based (not recommendation) systems. Ideally, your answers should consider the perspective (context) of using a SaaS business model and an open source rules engine. How would you advise addressing performance, scalability, and other architectural criteria? Any other considerations on undertaking such a project will be appreciated. Thanks much in advance!

    Read the article
