Search Results

  • I need to understand why my server turned off

    - by Dema
    Our organization was robbed, and it was definitely an inside job. I was set up. I work as a manager and as the system administrator in this organization, and everything points against me. The only clue I have is that someone accidentally or intentionally turned off a server in the office, which indicates that someone was inside at a time when no one should have been. This is the only evidence I have that can clear me. I looked at the log files and they show that the power button was pressed. Can you help me confirm that this was not a bug or a system overheat? I will post the log files, and if you ask for more I will gladly provide the information.

    Messages:

      Dec 24 21:43:14 jamx shutdown[27883]: shutting down for system halt
      Dec 24 21:43:15 jamx init: Switching to runlevel: 0
      Dec 24 21:43:15 jamx smartd[3047]: smartd received signal 15: Terminated
      Dec 24 21:43:15 jamx smartd[3047]: smartd is exiting (exit status 0)
      Dec 24 21:43:15 jamx avahi-daemon[3015]: Got SIGTERM, quitting.
      Dec 24 21:43:15 jamx avahi-daemon[3015]: Leaving mDNS multicast group on interface eth0.IPv6 with address fe80::221:85ff:fe11:8221.
      Dec 24 21:43:15 jamx avahi-daemon[3015]: Leaving mDNS multicast group on interface eth0.IPv4 with address 82.207.41.239.
      Dec 24 21:43:15 jamx shutdown[27962]: shutting down for system halt
      Dec 24 21:43:15 jamx saslauthd[2983]: server_exit     : master exited: 2983
      Dec 24 21:43:29 jamx nmbd[2921]: [2010/12/24 21:43:29, 0] nmbd/nmbd.c:terminate(58)
      Dec 24 21:43:29 jamx nmbd[2921]:   Got SIGTERM: going down...
      Dec 24 21:43:31 jamx clamd[2526]: Pid file removed.
      Dec 24 21:43:31 jamx clamd[2526]: --- Stopped at Fri Dec 24 21:43:31 2010
      Dec 24 21:43:31 jamx clamd[2526]: Socket file removed.
      Dec 24 21:43:31 jamx mydns[2645]: jamx.org.ua up 9h44m48s (35088s) 117 questions (0/s) NOERROR=117 SERVFAIL=0 NXDOMAIN=0 NOTIMP=0 REFUSED=0 (100% TCP, 117 queries)
      Dec 24 21:43:31 jamx mydns[2645]: terminated
      Dec 24 21:43:34 jamx ntpd[2512]: ntpd exiting on signal 15
      Dec 24 21:43:34 jamx hcid[2265]: Got disconnected from the system message bus
      Dec 24 21:43:35 jamx rpc.statd[2167]: Caught signal 15, un-registering and exiting.
      Dec 24 21:43:35 jamx portmap[28473]: connect from 127.0.0.1 to unset(status): request from unprivileged port
      Dec 24 21:43:35 jamx auditd[2021]: The audit daemon is exiting.
      Dec 24 21:43:35 jamx kernel: audit(1293219815.505:4044): audit_pid=0 old=2021 by auid=4294967295
      Dec 24 21:43:35 jamx pcscd: pcscdaemon.c:572:signal_trap() Preparing for suicide
      Dec 24 21:43:36 jamx pcscd: hotplug_libusb.c:376:HPRescanUsbBus() Hotplug stopped
      Dec 24 21:43:36 jamx pcscd: readerfactory.c:1379:RFCleanupReaders() entering cleaning function
      Dec 24 21:43:36 jamx pcscd: pcscdaemon.c:532:at_exit() cleaning /var/run
      Dec 24 21:43:36 jamx kernel: Kernel logging (proc) stopped.
      Dec 24 21:43:36 jamx kernel: Kernel log daemon terminating.
      Dec 24 21:43:37 jamx exiting on signal 15

    Acpid:

      [Fri Dec 24 21:43:14 2010] received event "button/power PWRF 00000080 00000001"
      [Fri Dec 24 21:43:14 2010] notifying client 2382[68:68]
      [Fri Dec 24 21:43:14 2010] executing action "/bin/ps awwux | /bin/grep gnome-power-manager | /bin/grep -qv grep || /sbin/shutdown -h now"
      [Fri Dec 24 21:43:14 2010] BEGIN HANDLER MESSAGES
      [Fri Dec 24 21:43:15 2010] END HANDLER MESSAGES
      [Fri Dec 24 21:43:15 2010] action exited with status 0
      [Fri Dec 24 21:43:15 2010] completed event "button/power PWRF 00000080 00000001"
      [Fri Dec 24 21:43:15 2010] received event "button/power PWRF 00000080 00000002"
      [Fri Dec 24 21:43:15 2010] notifying client 2382[68:68]
      [Fri Dec 24 21:43:15 2010] executing action "/bin/ps awwux | /bin/grep gnome-power-manager | /bin/grep -qv grep || /sbin/shutdown -h now"
      [Fri Dec 24 21:43:15 2010] BEGIN HANDLER MESSAGES
      [Fri Dec 24 21:43:15 2010] END HANDLER MESSAGES
      [Fri Dec 24 21:43:15 2010] action exited with status 0
      [Fri Dec 24 21:43:15 2010] completed event "button/power PWRF 00000080 00000002"
      [Fri Dec 24 21:43:34 2010] exiting
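    A note on reading the logs above: the acpid entries record a physical ACPI power-button event ("button/power PWRF"), which is distinct from a thermal shutdown; an overheat typically leaves "thermal" or "critical temperature" messages in the kernel log before the halt. Below is a minimal sketch of scanning the usual log locations for both kinds of lines; the file paths are common syslog/acpid defaults and may differ per distribution.

      # Hedged sketch: grep the common log files for power-button and
      # thermal/overheat lines. Paths are assumed defaults.
      import re

      PATTERN = re.compile(r'button/power|thermal|overheat|critical temperature',
                           re.IGNORECASE)

      for path in ('/var/log/messages', '/var/log/acpid'):
          try:
              with open(path) as f:
                  for line in f:
                      if PATTERN.search(line):
                          print('%s: %s' % (path, line.rstrip()))
          except IOError:
              pass  # log file not present on this system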

    Read the article

  • What causes the OpenID error: Received "invalidate_handle" from server

    - by BryanWheelock
    I'm new to OpenID, and I'm getting an "invalidate_handle" error that I have no idea how to fix. I'm using django_authopenid.

      [Thu Apr 29 14:13:28 2010] [error] Generated checkid_setup request to https://www.google.com/accounts/o8/ud with assocication AOxxxxxxxxOX5-V9oDc3-btHhFxzAcccccccccc2RTHgh
      [Thu Apr 29 14:13:29 2010] [error] Error attempting to use stored discovery information: <openid.consumer.consumer.TypeURIMismatch: Required type http://specs.openid.net/auth/2.0/signon not found in ['http://specs.openid.net/auth/2.0/server', 'http://openid.net/srv/ax/1.0', 'http://specs.openid.net/extensions/ui/1.0/mode/popup', 'http://specs.openid.net/extensions/ui/1.0/icon', 'http://specs.openid.net/extensions/pape/1.0'] for endpoint <openid.consumer.discover.OpenIDServiceEndpoint server_url='https://www.google.com/accounts/o8/ud' claimed_id=None local_id=None canonicalID=None used_yadis=True >>
      [Thu Apr 29 14:13:29 2010] [error] Attempting discovery to verify endpoint
      [Thu Apr 29 14:13:29 2010] [error] Performing discovery on https://www.google.com/accounts/o8/id?id=AOxxxxxxxxOX5-V9oDc3-btHhFxzAcccccccccc2RTHgh
      [Thu Apr 29 14:13:29 2010] [error] Received id_res response from https://www.google.com/accounts/o8/ud using association AOxxxxxxxxOX5-V9oDc3-btHhFxzAcccccccccc2RTHgh
      [Thu Apr 29 14:13:29 2010] [error] Using OpenID check_authentication
      [Thu Apr 29 14:13:29 2010] [error] op_endpoint
      [Thu Apr 29 14:13:29 2010] [error] claimed_id
      [Thu Apr 29 14:13:29 2010] [error] identity
      [Thu Apr 29 14:13:29 2010] [error] return_to
      [Thu Apr 29 14:13:29 2010] [error] response_nonce
      [Thu Apr 29 14:13:29 2010] [error] assoc_handle
      [Thu Apr 29 14:13:29 2010] [error] Received "invalidate_handle" from server https://www.google.com/accounts/o8/ud
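    A hedged observation for readers hitting the same message: "invalidate_handle" is the server telling the consumer that a stored association is stale, at which point the library falls back to check_authentication (which is exactly what the log above shows). A minimal sketch of purging stale associations from a python-openid file store follows; the store type and path are assumptions, since django_authopenid can be configured with other stores.

      # Hedged sketch: clear expired associations/nonces so the consumer
      # negotiates fresh ones on the next request.
      from openid.store.filestore import FileOpenIDStore

      store = FileOpenIDStore('/tmp/openid-store')  # hypothetical store path
      store.cleanup()  # removes expired associations and nonces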

    Read the article

  • django+mod_wsgi on virtualenv not working

    - by jwesonga
    I've just finished setting up a django app in a virtualenv. Deployment went smoothly using a fabric script, but now the .wsgi file is not working; I've tried every variation on the internet with no luck. My .wsgi file is:

      import os
      import sys
      import django.core.handlers.wsgi

      # put the Django project on sys.path
      root_path = os.path.abspath(os.path.dirname(__file__) + '../')
      sys.path.insert(0, os.path.join(root_path, 'kcdf'))
      sys.path.insert(0, root_path)

      os.environ['DJANGO_SETTINGS_MODULE'] = 'kcdf.settings'
      application = django.core.handlers.wsgi.WSGIHandler()

    I keep getting the same error:

      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] mod_wsgi (pid=16938): Exception occurred processing WSGI script '/home/kcdfweb/webapps/kcdf.web/releases/current/kcdf/apache/kcdf.wsgi'.
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] Traceback (most recent call last):
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] File "/usr/local/lib/python2.6/dist-packages/django/core/handlers/wsgi.py", line 230, in __call__
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] self.load_middleware()
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] File "/usr/local/lib/python2.6/dist-packages/django/core/handlers/base.py", line 33, in load_middleware
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] for middleware_path in settings.MIDDLEWARE_CLASSES:
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] File "/usr/local/lib/python2.6/dist-packages/django/utils/functional.py", line 269, in __getattr__
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] self._setup()
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] File "/usr/local/lib/python2.6/dist-packages/django/conf/__init__.py", line 40, in _setup
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] self._wrapped = Settings(settings_module)
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] File "/usr/local/lib/python2.6/dist-packages/django/conf/__init__.py", line 75, in __init__
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] raise ImportError, "Could not import settings '%s' (Is it on sys.path? Does it have syntax errors?): %s" % (self.SETTINGS_MODULE, e)
      [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] ImportError: Could not import settings 'kcdf.settings' (Is it on sys.path? Does it have syntax errors?): No module named kcdf.settings

    My virtual environment is at /home/user/webapps/kcdfweb, my app is at /home/user/webapps/kcdf.web/releases/current/project_name, and my wsgi file is at /home/user/webapps/kcdf.web/releases/current/project_name/apache/project_name.wsgi.
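    A hedged sketch of one likely fix: the error says kcdf.settings is not importable, and two things in the file above could cause that. First, os.path.dirname(__file__) + '../' builds a path like .../apache../ rather than the parent directory, so root_path never points where intended. Second, nothing puts the virtualenv's site-packages on sys.path. Something along these lines might behave better; the virtualenv root and Python version below are assumptions taken from the question.

      import os
      import site
      import sys

      # Hypothetical virtualenv root, per the question's description.
      VENV = '/home/user/webapps/kcdfweb'
      site.addsitedir(os.path.join(VENV, 'lib', 'python2.6', 'site-packages'))

      here = os.path.dirname(os.path.abspath(__file__))   # .../current/kcdf/apache
      project_dir = os.path.dirname(here)                 # .../current/kcdf
      release_root = os.path.dirname(project_dir)         # .../current
      sys.path.insert(0, project_dir)
      sys.path.insert(0, release_root)  # makes 'kcdf.settings' importable

      os.environ['DJANGO_SETTINGS_MODULE'] = 'kcdf.settings'

      import django.core.handlers.wsgi
      application = django.core.handlers.wsgi.WSGIHandler()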

    Read the article

  • Trying to get django app to work with mod_wsgi on CentOS 5

    - by David
    I'm running CentOS 5 and am trying to get a django application working with mod_wsgi. I'm using .wsgi settings I got working on Ubuntu. Here is the error:

      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] SystemError: dynamic module not initialized properly
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] mod_wsgi (pid=23630): Target WSGI script '/data/hosting/cubedev/apache/django.wsgi' cannot be loaded as Python module.
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] mod_wsgi (pid=23630): Exception occurred processing WSGI script '/data/hosting/cubedev/apache/django.wsgi'.
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] Traceback (most recent call last):
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] File "/data/hosting/cubedev/apache/django.wsgi", line 8, in
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] import django.core.handlers.wsgi
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] File "/opt/python2.6/lib/python2.6/site-packages/django/core/handlers/wsgi.py", line 1, in
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] from threading import Lock
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] File "/opt/python2.6/lib/python2.6/threading.py", line 13, in
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] from functools import wraps
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] File "/opt/python2.6/lib/python2.6/functools.py", line 10, in
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] from _functools import partial, reduce
      [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] SystemError: dynamic module not initialized properly

    And here is my .wsgi file:

      import os
      import sys

      os.environ['PYTHON_EGG_CACHE'] = '/tmp/django/'
      os.environ['DJANGO_SETTINGS_MODULE'] = 'cube.settings'
      sys.path.append('/data/hosting/cubedev')

      import django.core.handlers.wsgi
      application = django.core.handlers.wsgi.WSGIHandler()
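    A hedged note for anyone seeing the same traceback: "SystemError: dynamic module not initialized properly" while importing a stdlib module usually means mod_wsgi was compiled against a different Python than the /opt/python2.6 install whose library files it is loading, so confirming which interpreter mod_wsgi actually embeds is a reasonable first step. A throwaway script like the sketch below, pointed at by WSGIScriptAlias in place of the django one, shows that:

      # Minimal diagnostic WSGI app: reports the interpreter mod_wsgi embeds.
      import sys

      def application(environ, start_response):
          body = 'version: %s\nprefix: %s\npath: %r\n' % (
              sys.version, sys.prefix, sys.path)
          start_response('200 OK', [('Content-Type', 'text/plain'),
                                    ('Content-Length', str(len(body)))])
          return [body]

    If the reported version or prefix differs from /opt/python2.6, rebuilding mod_wsgi against that Python (or pointing Apache at a matching build) would be the direction to investigate.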

    Read the article

  • Laissez les bons temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    "Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack. Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while. Self-Service BI Self-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI. This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me: PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.) Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.) One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
(This is hugely practical, especially if you build up calculations incrementally. You can more easily follow the logic from calculation to calculation. Furthermore, if you need to make a change to one calculation, you can assess the impact on other calculations.) Another product demonstration will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.) Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but more useful by describing whether the content is found in a table or a chart, for example.) This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and grows in complexity in proportion to the need to simplify BI for users. It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and even acknowledgment that users would get the data they want even if it meant they would have to manually enter into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations. Collaborative BI I have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I created a complete project management from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system as well, but didn't know that's what I was doing at the time. 
Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week where I got to reminisce with him for a bit) and that began a career that I never imagined at the time. Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective of the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases that was demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people." The Microsoft BI Stack in General A question I hear all the time from students when I'm teaching is how to know what tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years. Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compared to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provided innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?" Expo Hall I had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught me eye, however, that I'll share here. 
Pragmatic Works. Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions. Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught me eye which I think is a really great move. When you buy Microsoft Press ebooks through the O'Reilly web site, you can receive it in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I fnd myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind! Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are accessible for viewing online at http://www.msteched.com/2010/NorthAmerica along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation as shown below. Share this post: email it! | bookmark it! | digg it! | reddit! | kick it! | live it!

    Read the article

  • Upgrading SharePoint MOSS 2007 Farm to SharePoint 2010: "waiting to get a lock to upgrade the farm"

    - by Wes Weeks
    My first in-place upgrade of a MOSS 2007 farm to SharePoint 2010 went pretty smoothly. I read the pre-upgrade documentation and was comfortable with the steps. Since it was a fairly new installation of MOSS, changes were minimal and I wasn't anticipating too many problems.

    The one issue I hit came after installing the software across the farm. I went to the first machine, which ran SharePoint 2010 Central Administration, and ran the SharePoint 2010 Products Configuration Wizard. I received the message that I would need to run the configuration on each server in the farm. Fair enough; I expected as much. The wizard completed without issue on the first server, but when I tried to run it on the others it hung with a "waiting to get a lock to upgrade the farm" message. It hung for about 10 minutes and then the wizard failed. I did a few searches on Google and Bing and got 0 results for that message. None, nothing, zilch. I'm on my own...

    For grins, I hit the help button on the configuration wizard, and it seemed to indicate that the configuration wizard needed to be run on all farm servers simultaneously. I started it again on the first server up to the point where I got the message about needing to run on all servers in the farm, then started the wizard on the other servers and ran each to that point as well. I then clicked OK on the first server and then on the subsequent servers. It took a while, and it did hang on the lock message for some time, but then it kicked off and completed successfully on all of them. Yeah! Hope this helps someone else! Now there should be at least one post with this error message on it!

    Read the article

  • Sam Abraham to Speak about MVC2 at the Florida.Net Miramar .Net User Group on July 13 2010

    - by Sam Abraham
    I am scheduled to give a presentation at the Miramar .Net User Group on July 13, 2010 about MVC and the new features in MVC2. It will be similar to my earlier talk but with more advanced content, since the group has already had an introduction to MVC in a previous meeting. Here is the topic and speaker bio:

    Sam Abraham To Speak At The LI .Net User Group on June 3rd, 2010

    As you might know, I lived and worked on LI, NY for 11 years before relocating to South Florida. As I will be visiting my family, who still live there, in the first week of June, I couldn't resist reaching out to Dan Galvez, LI .Net User Group Leader, and asking if he needed a speaker for June's meeting. Apparently the stars were lined up right, and I am now scheduled to speak at my "home" group on June 3rd, which I am pretty excited about. Here is a brief abstract of my talk and speaker bio.

    What's New in MVC2

    We will start by briefly reviewing the basics of the Microsoft MVC Framework. Next, we will look at the new features introduced in the latest and greatest MVC2. Many new enhancements were introduced to both the MS MVC Framework and to VS2010 to improve developers' experience and reduce development time. We will be talking about new MVC2 features such as Model Validation, Areas, and Template Helpers. We will also discuss the new built-in MVC project templates that ship with VS2010.

    About the Speaker

    Sam Abraham is a Microsoft Certified Professional (MCP) and Microsoft Certified Technology Specialist (MCTS ASP.Net 3.5). He currently lives in South Florida, where he leads the West Palm Beach .Net User Group (www.fladotnet.com) and actively participates in various local .Net community events as organizer and/or technical speaker. Sam is also an active committee member on various initiatives at the South Florida Chapter of the Project Management Institute (www.southfloridapmi.org). Sam finds his passion in leveraging the latest and greatest .Net technologies along with proven project management practices and methodologies to produce high-quality, cost-competitive software. Sam can be reached through his blog: http://www.geekswithblogs.net/wildturtle

    Read the article

  • iSCSI connection timing out on ESX 3.5 against a NetApp storage appliance

    - by Jesse1973
    I get a timeout on iSCSI on different ESX hosts (3.5) at different times. It is puzzling, as both the ESX hosts and the Windows and other guests are experiencing timeouts. The iSCSI network is segregated on a private network. Here is an export of vmkiscsid.log from last night:

      2010-02-10-12:30:38: iscsid: an InitiatorAlias= is required, but was not found in /etc/vmware/vmkiscsid/initiatorname.iscsi
      2010-02-10-12:30:38: iscsid: LogLevel = 0
      2010-02-10-12:30:38: iscsid: LogSync = 0
      2010-02-10-12:30:42: iscsid: Login Success: iqn.1992-08.com.netapp:sn.101197719,default,192.168.73.2,3260,2001, 0x1
      2010-02-10-12:30:42: iscsid: connection1:0 is operational now
      2010-02-16-02:03:35: iscsid: Kernel reported iSCSI connection 1:0 error (1008) state (3)
      2010-02-16-02:03:39: iscsid: connection1:0 is operational after recovery (2 attempts)
      2010-02-16-04:02:27: iscsid: Kernel reported iSCSI connection 1:0 error (1008) state (3)
      2010-02-16-04:02:32: iscsid: connection1:0 is operational after recovery (2 attempts)

    Should I edit the timeout value on the ESX host for iSCSI? That may work around the problem, but it will not solve it.

    Read the article

  • Can I speed up cygwin's fork?

    - by Andrew Aylett
    I came across a post discussing the speed of forking in Cygwin, giving an expected 'fork rate' in Windows XP of around 30-50 per second (link). I've got a Core 2 Duo (1.79GHz), which I would expect to get comparable results, but it's only managing around 8 forks per second (and sometimes a lot fewer):

      $ while (true); do date --utc; done | uniq -c
            5 Wed Apr 21 12:38:10 UTC 2010
            6 Wed Apr 21 12:38:11 UTC 2010
            1 Wed Apr 21 12:38:12 UTC 2010
            1 Wed Apr 21 12:38:13 UTC 2010
            8 Wed Apr 21 12:38:14 UTC 2010
            8 Wed Apr 21 12:38:15 UTC 2010
            6 Wed Apr 21 12:38:16 UTC 2010
            1 Wed Apr 21 12:38:18 UTC 2010
            9 Wed Apr 21 12:38:19 UTC 2010

    Can you suggest anything I might be able to do to speed things up? Cygwin on this machine feels a lot slower than on other machines I've used before, even though they were actually slower hardware.
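    For a cleaner measurement than the date-plus-uniq loop above (which also pays exec and pipe I/O costs on every iteration), a bare fork/exit loop isolates the fork cost itself. Below is a minimal sketch in Python, which Cygwin ships; the numbers will of course vary per machine.

      # Rough fork-rate probe for POSIX-ish environments such as Cygwin.
      import os
      import time

      def forks_per_second(duration=3.0):
          """Fork and immediately reap children for `duration` seconds."""
          count = 0
          deadline = time.time() + duration
          while time.time() < deadline:
              pid = os.fork()
              if pid == 0:
                  os._exit(0)      # child: exit without running any cleanup
              os.waitpid(pid, 0)   # parent: reap the child
              count += 1
          return count / duration

      if __name__ == '__main__':
          print('approx. forks/sec: %.1f' % forks_per_second())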

    Read the article

  • Where did ULSTraceLog go to in the SharePoint 2010 Logging Database?

    - by Jan Tielens
    The Logging Database is one of the many new concepts that will make the life of many SharePoint administrators quite a bit more enjoyable. In SharePoint 2007, the Unified Logging System (ULS) logged all of its data to text files, typically found on your SharePoint server in 12\LOGS. We still have that in SharePoint 2010, but besides those text files, ULS can also write the data to a database! The advantages are obvious: easy to query, one central location for all servers in the farm, easy to build reports, etc. You can find this ULS data in the SharePoint 2010 logging database (typically called WSS_Logging), in the view ULSTraceLog.

    Quite recently, on one of my demo machines (standalone installation on Windows 7), I noticed the ULSTraceLog view was not available in the logging database. It turned out that there is a timer job that's responsible for writing the data to the database; when the timer job hasn't executed, the view is not there (the first time it executes, the view is created). Even more, the timer job was disabled, so the view would never be created, nor would any data be written to the database. If you encounter this situation as well, it's quite easy to solve:

      1. Open the SharePoint Central Administration site.
      2. Navigate to the Monitoring section.
      3. Select Review Job Definitions.
      4. Click on the job with the name Diagnostic Data Provider: Trace Log.
      5. Click on the Enable button to enable it.
      6. Optionally, click on Run Now afterwards to start it immediately.

    There you go: the ULSTraceLog view will be created and the ULS messages will appear in the database!
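    Once the view exists, it is an ordinary SQL Server view, so it can be queried from anything that talks to SQL Server. A hedged sketch follows: the server name is hypothetical, WSS_Logging is only the default database name, and the column list is assumed from the default schema, so adjust to what your farm actually has.

      # Hedged sketch: pull the most recent trace entries from the view.
      import pyodbc

      conn = pyodbc.connect('DRIVER={SQL Server};SERVER=SP2010SQL;'
                            'DATABASE=WSS_Logging;Trusted_Connection=yes')
      cursor = conn.cursor()
      cursor.execute("SELECT TOP 10 LogTime, Area, Category, Message "
                     "FROM dbo.ULSTraceLog ORDER BY LogTime DESC")
      for row in cursor.fetchall():
          print(row.LogTime, row.Area, row.Category, row.Message)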

    Read the article

  • Crystal Reports 8 - Error 533 PESStartPrintJob

    - by Federico Giust
    At the company I work for, we have an application built in Delphi 5 with Crystal Reports 8. We all know that Crystal and Delphi can be temperamental sometimes, and the worst part is that there is nearly no detail on the error. There is also a big lack of documentation on the web about this: lots of people report a similar issue, but there is no solution. The error I'm talking about in particular is the one in the image below. It happens when trying to print any Crystal report on screen. It has happened occasionally on a client's computer, and it was hard for us to replicate in our environment. Since it's an old version of Crystal, it is very hard to find any helpful documentation to pin down the exact source of the problem.

    Read the article

  • How do I install MS Office 2010 via WINE?

    - by Emeris
    I am trying to install MS Office 2010 on Ubuntu 12.04 on my new MacBook Pro (15"). I have already read and followed every existing forum thread and tutorial on the subject, but my problem seems unique so far, since whichever solution I try, the problem remains. When I launch PlayOnLinux, two boxes appear one after the other (before last week's Ubuntu upgrade, only the first box appeared). The first one tells me:

      Error: PlayOnLinux is unable to find 32-bits OpenGL libraries. You might encounter problem with your games.

    When I close this window, a second one pops up, stating:

      Error: PlayOnLinux cannot find 7z. You should install it to use PlayOnLinux.

    Of course, I tried purging PlayOnLinux (uninstalling it and re-installing it). I also tried other versions of PlayOnLinux. Nothing matters: the problem remains. I have not succeeded so far in installing the 32-bit OpenGL libraries, since I have a Radeon graphics card (which seems to be unusual) and I just cannot find these libraries. Once the two error boxes are closed, PlayOnLinux is open but does not seem to work properly; when I try to install Microsoft Office 2010, nothing happens. When I try to close PlayOnLinux, it is even worse: Unity seems unable to close it (I even got a frozen screen when trying to xkill it through the terminal). I am looking forward to any tips that could help.

    P.S.: 01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Whistler [AMD Radeon HD 6600M Series]

    Read the article

  • Pair Programming: Pros and Cons

    - by O.D
    I need some experience reports from people who have done pair programming. I've noticed that lots of people recommend it, but my experience was that at some point it's more efficient to sit alone, think, and then write code than to talk with the other programmer (which can also be very annoying to other programmers in the same office). Do you agree with this? And if yes, can you mention situations where pair programming is less efficient than traditional programming? Actually, I'm more interested in the cons than the pros, but if it's your own experience I would like to read both. I would also like to read what you think the programmer without the keyboard can do in the meantime, other than talking about the concept or checking the code on the screen.

    Read the article

  • Problems with my slot game

    - by Raiden2k
    I'm coding a slot game for learning. Here's the source code; my questions are below it.

      unit Unit1;

      {$mode objfpc}{$H+}

      interface

      uses
        Classes, SysUtils, Windows, FileUtil, Forms, Controls, Graphics,
        Dialogs, StdCtrls, ExtCtrls, ComCtrls, Menus, ActnList, Spin, FileCtrl;

      type
        { TForm1 }
        TForm1 = class(TForm)
          FloatSpinEdit1: TFloatSpinEdit;
          Guthabenlb: TLabel;
          s1: TLabel;
          s2: TLabel;
          s3: TLabel;
          s4: TLabel;
          s5: TLabel;
          s6: TLabel;
          s7: TLabel;
          s8: TLabel;
          s9: TLabel;
          Winlb: TLabel;
          Loselb: TLabel;
          slotbn: TButton;
          Timer1: TTimer;
          Timer2: TTimer;
          Timer3: TTimer;
          procedure FormCreate(Sender: TObject);
          procedure slotbnClick(Sender: TObject);
          procedure Timer1Timer(Sender: TObject);
          procedure Timer2Timer(Sender: TObject);
          procedure Timer3Timer(Sender: TObject);
        private
          { private declarations }
          FRollen: array [0..2, 0..9] of String;
        public
          { public declarations }
        end;

      var
        Form1: TForm1;
        wins, loses: Integer;
        guthaben: Double = 10;

      implementation

      {$R *.lfm}

      { TForm1 }

      procedure TForm1.slotbnClick(Sender: TObject);
      begin
        Guthaben := Guthaben - 1.00;
        Guthabenlb.Caption := FloatToStr(guthaben) + (' €');
        Timer1.Enabled := True;
        Timer2.Enabled := True;
        slotbn.Enabled := False;
      end;

      procedure TForm1.FormCreate(Sender: TObject);
      var
        i, j, n: integer;
        digits: TStringList;
      begin
        Digits := TStringList.Create;
        try
          for i := Low(FRollen) to High(FRollen) do begin
            for j := Low(FRollen[i]) to High(FRollen[i]) do
              Digits.Add(IntToStr(j));
            for j := Low(FRollen[i]) to High(FRollen[i]) do begin
              n := Random(Digits.Count);
              FRollen[i, j] := Digits[n];
              Digits.Delete(n);
            end;
          end;
        finally
          Digits.Free;
        end;
      end;

      // Drehen der Slots im Zufallsmodus (spin the slots randomly)
      procedure TForm1.Timer1Timer(Sender: TObject);
      begin
        s1.Caption := IntToStr(Random(9));
        s2.Caption := IntToStr(Random(9));
        s3.Caption := IntToStr(Random(9));
        s4.Caption := IntToStr(Random(9));
        s5.Caption := IntToStr(Random(9));
        s6.Caption := IntToStr(Random(9));
        s7.Caption := IntToStr(Random(9));
        s8.Caption := IntToStr(Random(9));
        s9.Caption := IntToStr(Random(9));
      end;

      // Gewonnen/Verloren-Abfrage (win/lose check)
      procedure TForm1.Timer2Timer(Sender: TObject);
      begin
        Timer1.Enabled := False;
        Timer2.Enabled := False;
        if (s1.Caption = s5.Caption) and (s1.Caption = s9.Caption) then begin
          Guthaben := Guthaben + 5.00;
          Inc(wins);
        end
        else if (s1.Caption = s4.Caption) and (s1.Caption = s7.Caption) then begin
          Guthaben := Guthaben + 5.00;
          Inc(wins);
        end
        else if (s2.Caption = s5.Caption) and (s2.Caption = s8.Caption) then begin
          Guthaben := Guthaben + 5.00;
          Inc(wins);
        end
        else if (s3.Caption = s6.Caption) and (s3.Caption = s9.Caption) then begin
          Guthaben := Guthaben + 5.00;
          Inc(wins);
        end
        else if (s3.Caption = s5.Caption) and (s3.Caption = s7.Caption) then begin
          Guthaben := Guthaben + 5.00;
          Inc(wins);
        end
        else
          Inc(loses);
        slotbn.Enabled := True;
        Loselb.Caption := 'Loses: ' + IntToStr(loses);
        Winlb.Caption := 'Wins: ' + IntToStr(Wins);
      end;

      procedure TForm1.Timer3Timer(Sender: TObject);
      begin
        if (guthaben = 0) or (guthaben < 0) then begin
          Timer3.Enabled := False;
          MessageBox(Handle, 'Du hast verloren!', 'Verlierer!', MB_OK);
          Close();
        end;
      end;

      end.

    My questions:

      1. How can I replace the labels with 16 x 16 pixel icons?
      2. How can I adjust the winning sum according to the icons (for example, 3 crowns pay 40 € and 3 apples only 10 €)?
      3. How can I adjust the winning sum with a chosen stake for every round?

    Read the article

  • Gems In The Visual Studio 2010 Training Kit – Introduction to MEF: Learning Labs

    - by Jim Duffy
    No, this post doesn’t have anything to do with cooking up illegal drugs in some rundown shack outside of town. That, my friends, would be a meth lab and fortunately that is waaaaay outside my area of expertise. Now I can talk Kentucky bourbon, or as Homer Simpson would say “mmmmmmmmmmm bourbon”, with you but please refrain from asking me meth questions. :-) Anyway, what I’m talking about are the MEF (Managed Extensibility Framework) Learning Labs contained in the Visual Studio 2010 and .NET Framework 4 Training Kit. Not sure what MEF is and need an overview? Then start here or here. Ok, so you’ve read a bit about MEF or heard about MEF and you’re thinking it might be something you and your development team might want to take a hands-on look at. I have good news then because contained in the Visual Studio 2010 and .NET Framework 4 Training Kit is a series of hands-on learning labs for MEF. I’ve added working my way through them to my “things I want to take a closer look at” list. Have a day. :-|

    Read the article

  • How to layout class definition when inheriting from multiple interfaces

    - by gabr
    Given two interface definitions ...

    IOmniWorkItem = interface ['{3CE2762F-B7A3-4490-BF22-2109C042EAD1}']
      function  GetData: TOmniValue;
      function  GetResult: TOmniValue;
      function  GetUniqueID: int64;
      procedure SetResult(const value: TOmniValue);
      //
      procedure Cancel;
      function  DetachException: Exception;
      function  FatalException: Exception;
      function  IsCanceled: boolean;
      function  IsExceptional: boolean;
      property Data: TOmniValue read GetData;
      property Result: TOmniValue read GetResult write SetResult;
      property UniqueID: int64 read GetUniqueID;
    end;

    IOmniWorkItemEx = interface ['{3B48D012-CF1C-4B47-A4A0-3072A9067A3E}']
      function  GetOnWorkItemDone: TOmniWorkItemDoneDelegate;
      function  GetOnWorkItemDone_Asy: TOmniWorkItemDoneDelegate;
      procedure SetOnWorkItemDone(const Value: TOmniWorkItemDoneDelegate);
      procedure SetOnWorkItemDone_Asy(const Value: TOmniWorkItemDoneDelegate);
      //
      property OnWorkItemDone: TOmniWorkItemDoneDelegate read GetOnWorkItemDone write SetOnWorkItemDone;
      property OnWorkItemDone_Asy: TOmniWorkItemDoneDelegate read GetOnWorkItemDone_Asy write SetOnWorkItemDone_Asy;
    end;

    ... what are your ideas on laying out a class declaration that inherits from both of them? My current idea (but I don't know if I'm happy with it):

    TOmniWorkItem = class(TInterfacedObject, IOmniWorkItem, IOmniWorkItemEx)
    strict private
      FData              : TOmniValue;
      FOnWorkItemDone    : TOmniWorkItemDoneDelegate;
      FOnWorkItemDone_Asy: TOmniWorkItemDoneDelegate;
      FResult            : TOmniValue;
      FUniqueID          : int64;
    strict protected
      procedure FreeException;
    protected //IOmniWorkItem
      function  GetData: TOmniValue;
      function  GetResult: TOmniValue;
      function  GetUniqueID: int64;
      procedure SetResult(const value: TOmniValue);
    protected //IOmniWorkItemEx
      function  GetOnWorkItemDone: TOmniWorkItemDoneDelegate;
      function  GetOnWorkItemDone_Asy: TOmniWorkItemDoneDelegate;
      procedure SetOnWorkItemDone(const Value: TOmniWorkItemDoneDelegate);
      procedure SetOnWorkItemDone_Asy(const Value: TOmniWorkItemDoneDelegate);
    public
      constructor Create(const data: TOmniValue; uniqueID: int64);
      destructor  Destroy; override;
    public //IOmniWorkItem
      procedure Cancel;
      function  DetachException: Exception;
      function  FatalException: Exception;
      function  IsCanceled: boolean;
      function  IsExceptional: boolean;
      property Data: TOmniValue read GetData;
      property Result: TOmniValue read GetResult write SetResult;
      property UniqueID: int64 read GetUniqueID;
    public //IOmniWorkItemEx
      property OnWorkItemDone: TOmniWorkItemDoneDelegate read GetOnWorkItemDone write SetOnWorkItemDone;
      property OnWorkItemDone_Asy: TOmniWorkItemDoneDelegate read GetOnWorkItemDone_Asy write SetOnWorkItemDone_Asy;
    end;

    As noted in the answers, composition is a good approach for this example, but I'm not sure it applies in all cases. Sometimes I use multiple inheritance just to split read and write access to some property into a public (typically read-only) and a private (typically write-only) part. Does composition still apply there? I'm not really sure, as I would have to move the property in question out of the main class, and I'm not sure that's the correct way to do it.
    Example:

    // public part of the interface
    IOmniWorkItemConfig = interface
      function OnExecute(const aTask: TOmniBackgroundWorkerDelegate): IOmniWorkItemConfig;
      function OnRequestDone(const aTask: TOmniWorkItemDoneDelegate): IOmniWorkItemConfig;
      function OnRequestDone_Asy(const aTask: TOmniWorkItemDoneDelegate): IOmniWorkItemConfig;
    end;

    // private part of the interface
    IOmniWorkItemConfigEx = interface ['{42CEC5CB-404F-4868-AE81-6A13AD7E3C6B}']
      function GetOnExecute: TOmniBackgroundWorkerDelegate;
      function GetOnRequestDone: TOmniWorkItemDoneDelegate;
      function GetOnRequestDone_Asy: TOmniWorkItemDoneDelegate;
    end;

    // implementing class
    TOmniWorkItemConfig = class(TInterfacedObject, IOmniWorkItemConfig, IOmniWorkItemConfigEx)
    strict private
      FOnExecute        : TOmniBackgroundWorkerDelegate;
      FOnRequestDone    : TOmniWorkItemDoneDelegate;
      FOnRequestDone_Asy: TOmniWorkItemDoneDelegate;
    public
      constructor Create(defaults: IOmniWorkItemConfig = nil);
    public //IOmniWorkItemConfig
      function OnExecute(const aTask: TOmniBackgroundWorkerDelegate): IOmniWorkItemConfig;
      function OnRequestDone(const aTask: TOmniWorkItemDoneDelegate): IOmniWorkItemConfig;
      function OnRequestDone_Asy(const aTask: TOmniWorkItemDoneDelegate): IOmniWorkItemConfig;
    public //IOmniWorkItemConfigEx
      function GetOnExecute: TOmniBackgroundWorkerDelegate;
      function GetOnRequestDone: TOmniWorkItemDoneDelegate;
      function GetOnRequestDone_Asy: TOmniWorkItemDoneDelegate;
    end;
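
    A minimal sketch of the composition route the answers point at — the helper-class name and wiring below are my own illustration, not OmniThreadLibrary code — uses Delphi's implements clause together with a TAggregatedObject descendant, so the delegated interface shares the outer object's reference count:

    type
      // Helper that implements only IOmniWorkItemEx; TAggregatedObject keeps a
      // weak reference to its controller, so QueryInterface and refcounting are
      // forwarded to the owning TOmniWorkItem.
      TOmniWorkItemExImpl = class(TAggregatedObject, IOmniWorkItemEx)
      strict private
        FOnWorkItemDone    : TOmniWorkItemDoneDelegate;
        FOnWorkItemDone_Asy: TOmniWorkItemDoneDelegate;
      public
        function  GetOnWorkItemDone: TOmniWorkItemDoneDelegate;
        function  GetOnWorkItemDone_Asy: TOmniWorkItemDoneDelegate;
        procedure SetOnWorkItemDone(const Value: TOmniWorkItemDoneDelegate);
        procedure SetOnWorkItemDone_Asy(const Value: TOmniWorkItemDoneDelegate);
      end;

      TOmniWorkItem = class(TInterfacedObject, IOmniWorkItem, IOmniWorkItemEx)
      strict private
        FExImpl: TOmniWorkItemExImpl;
      protected
        // Delegate the whole IOmniWorkItemEx contract to the helper object.
        property ExImpl: TOmniWorkItemExImpl read FExImpl implements IOmniWorkItemEx;
      public
        constructor Create(const data: TOmniValue; uniqueID: int64);
        destructor  Destroy; override;
        // ... IOmniWorkItem members exactly as in the declaration above ...
      end;

    constructor TOmniWorkItem.Create(const data: TOmniValue; uniqueID: int64);
    begin
      inherited Create;
      FExImpl := TOmniWorkItemExImpl.Create(Self); // Self becomes the controller
    end;

    destructor TOmniWorkItem.Destroy;
    begin
      FExImpl.Free;
      inherited;
    end;

    The same delegation could cover the read/write split from the second example: the main class keeps the public interface while the private, write-only interface lives on the helper, so the public surface does not have to move.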

    Read the article

  • Finding the try for an except or finally [migrated]

    - by ?s?
    I'm dealing with some code that has fantastically long methods (10k lines!) and some odd use of try-finally and try-except blocks. Some of the latter are long by themselves, and don't always have the try at the start of the method. Obviously I'm trying to refactor the code, but in the meantime, just fixing a couple of common pathologies would be much easier if I could jump to the start of a block and see what is happening there. When the try is 20+ pages away, finding it even with the CNPack rainbows is just tedious. I'm using D2010 and have GExperts (with DelForExp), CNPack and DDevExtensions installed, but I can't find anything that lets me jump from the try to the finally or back. Am I missing something? Is there another add-in I can use that will get me this?
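
    For the long-method problem itself, one low-risk interim refactoring keeps every try..finally pair on a single screen: extract the guarded body into its own method. A sketch (all names are hypothetical, not from the code base in question):

    procedure TImporter.Run;
    var
      db: TDatabaseHandle;     // hypothetical resource type
    begin
      db := OpenDatabase;      // hypothetical acquisition
      try
        ProcessEverything(db); // the former in-line 10k-line body, now a method
      finally
        db.Free;
      end;
    end;

    With the guarded body reduced to a single call, the try and its finally are never more than a few lines apart, which makes the block structure readable even without IDE navigation support.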

    Read the article

  • Does anyone know how to appropriately deal with user timezones in rails 2.3?

    - by Amazing Jay
    We're building a Rails app that needs to display dates (and, more importantly, calculate them) in multiple time zones. Can anyone point me towards how to work with user time zones in Rails 2.3 (.5 or .8)? The most inclusive article I've seen detailing how user time zones are supposed to work is here: http://wiki.rubyonrails.org/howtos/time-zones... although it is unclear when this was written or for what version of Rails. Specifically, it states that: "Time.zone - The time zone that is actually used for display purposes. This may be set manually to override config.time_zone on a per-request basis." The key terms are "display purposes" and "per-request basis". Locally on my machine, this is true. However, on production, neither is true. Setting Time.zone persists past the end of the request (to all subsequent requests) and also affects the way AR saves to the DB (basically treating any date as if it were already in UTC even when it's not), thus saving completely inappropriate values. We run Ruby Enterprise Edition on production with Passenger. If this is my problem, do we need to switch to JRuby or something else? To illustrate the problem, I put the following actions in my ApplicationController:

    def test
      p_time = Time.now.utc
      s_time = Time.utc(p_time.year, p_time.month, p_time.day, p_time.hour)
      logger.error "TIME.ZONE" + Time.zone.inspect
      logger.error ENV['TZ'].inspect
      logger.error p_time.inspect
      logger.error s_time.inspect
      jl = JunkLead.create!
      jl.date_at = s_time
      logger.error s_time.inspect
      logger.error jl.date_at.inspect
      jl.save!
      logger.error s_time.inspect
      logger.error jl.date_at.inspect
      render :nothing => true, :status => 200
    end

    def test2
      Time.zone = 'Mountain Time (US & Canada)'
      logger.error "TIME.ZONE" + Time.zone.inspect
      logger.error ENV['TZ'].inspect
      render :nothing => true, :status => 200
    end

    def test3
      Time.zone = 'UTC'
      logger.error "TIME.ZONE" + Time.zone.inspect
      logger.error ENV['TZ'].inspect
      render :nothing => true, :status => 200
    end

    and they yield the following:

    Processing ApplicationController#test (for 98.202.196.203 at 2010-12-24 22:15:50) [GET]
    TIME.ZONE#<ActiveSupport::TimeZone:0x2c57a68 @tzinfo=#<TZInfo::DataTimezone: Etc/UTC>, @name="UTC", @utc_offset=0>
    nil
    Fri Dec 24 22:15:50 UTC 2010
    Fri Dec 24 22:00:00 UTC 2010
    Fri Dec 24 22:00:00 UTC 2010
    Fri, 24 Dec 2010 22:00:00 UTC +00:00
    Fri Dec 24 22:00:00 UTC 2010
    Fri, 24 Dec 2010 22:00:00 UTC +00:00
    Completed in 21ms (View: 0, DB: 4) | 200 OK [http://www.dealsthatmatter.com/test]

    Processing ApplicationController#test2 (for 98.202.196.203 at 2010-12-24 22:15:53) [GET]
    TIME.ZONE#<ActiveSupport::TimeZone:0x2c580a8 @tzinfo=#<TZInfo::DataTimezone: America/Denver>, @name="Mountain Time (US & Canada)", @utc_offset=-25200>
    nil
    Completed in 143ms (View: 1, DB: 3) | 200 OK [http://www.dealsthatmatter.com/test2]

    Processing ApplicationController#test (for 98.202.196.203 at 2010-12-24 22:15:59) [GET]
    TIME.ZONE#<ActiveSupport::TimeZone:0x2c580a8 @tzinfo=#<TZInfo::DataTimezone: America/Denver>, @name="Mountain Time (US & Canada)", @utc_offset=-25200>
    nil
    Fri Dec 24 22:15:59 UTC 2010
    Fri Dec 24 22:00:00 UTC 2010
    Fri Dec 24 22:00:00 UTC 2010
    Fri, 24 Dec 2010 15:00:00 MST -07:00
    Fri Dec 24 22:00:00 UTC 2010
    Fri, 24 Dec 2010 15:00:00 MST -07:00
    Completed in 20ms (View: 0, DB: 4) | 200 OK [http://www.dealsthatmatter.com/test]

    Processing ApplicationController#test3 (for 98.202.196.203 at 2010-12-24 22:16:03) [GET]
    TIME.ZONE#<ActiveSupport::TimeZone:0x2c57a68 @tzinfo=#<TZInfo::DataTimezone: Etc/UTC>, @name="UTC", @utc_offset=0>
    nil
    Completed in 17ms (View: 0, DB: 2) | 200 OK [http://www.dealsthatmatter.com/test3]

    Processing ApplicationController#test (for 98.202.196.203 at 2010-12-24 22:16:04) [GET]
    TIME.ZONE#<ActiveSupport::TimeZone:0x2c57a68 @tzinfo=#<TZInfo::DataTimezone: Etc/UTC>, @name="UTC", @utc_offset=0>
    nil
    Fri Dec 24 22:16:05 UTC 2010
    Fri Dec 24 22:00:00 UTC 2010
    Fri Dec 24 22:00:00 UTC 2010
    Fri, 24 Dec 2010 22:00:00 UTC +00:00
    Fri Dec 24 22:00:00 UTC 2010
    Fri, 24 Dec 2010 22:00:00 UTC +00:00
    Completed in 151ms (View: 0, DB: 4) | 200 OK [http://www.dealsthatmatter.com/test]

    It should be clear above that the second call to /test shows Time.zone set to Mountain, even though it shouldn't be. Additionally, checking the database reveals that the test action, when run after test2, saved a JunkLead record with a date of 2010-12-22 15:00:00, which is clearly wrong.

    Read the article

  • App losing db connection

    - by DaveKub
    I'm having a weird issue with an old Delphi app losing its database connection. Actually, I think it's losing something else that then makes the connection either drop or become unusable. The app is written in Delphi 6 and uses the Direct Oracle Access component (v4.0.7.1) to connect to an Oracle 9i database. The app runs as a service and periodically queries the db using a TOracleQuery object (qryAlarmList). The method that is called to do this looks like this:

    procedure TdmMain.RefreshAlarmList;
    begin
      try
        qryAlarmList.Execute;
      except
        on E: Exception do begin
          FStatus := ssError;
          EventLog.LogError(-1, 'TdmMain.RefreshAlarmList', 'Message: ' + E.Message);
        end;
      end;
    end;

    It had been running fine for years, until a couple of Perl scripts were added to this machine. These scripts run every 15 minutes, look for data files to import into the db, and then do some calculations and a bunch of reads/writes to/from the db. For some reason, when they are processing large amounts of data and the Delphi app then tries to query the db, the Delphi app throws an exception at the "qryAlarmList.Execute" line in the above listing. The exception is always:

    Access violation at address 00000000. read of address 00000000

    HOW can something the Perl scripts are doing cause this?? There are other Perl scripts on this machine that load data using the same modules and method calls, and we didn't have problems with those. To make it even weirder, two other apps also suddenly lose their ability to talk to the database at the same time the Perl stuff is running. Neither of those apps runs on this machine, but both are Delphi 6 apps that use the same DOA component to connect to the same database. We have other apps that connect to the same db, written in Java or C#, and they don't seem to have any problems. I've tried adding code before the '.Execute' method is called to:
    - check the session's connection (session.CheckConnection(true) always comes back as 'ccOK');
    - see whether I can access a field of the qryAlarmList object, in case it has become null (I can access it fine);
    - check the state of qryAlarmList (it always says it's qsIdle).
    Does anyone have any suggestions of something to try? This is driving me nuts! Dave
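
    One interim workaround may be worth a try — a sketch only, assuming the query exposes its TOracleSession through a Session property and that DOA's LogOff/LogOn can be called this way (verify against the DOA v4 documentation): force a clean reconnect whenever Execute fails, so the next poll starts from fresh connection state rather than reusing whatever was corrupted:

    procedure TdmMain.RefreshAlarmList;
    begin
      try
        qryAlarmList.Execute;
      except
        on E: Exception do begin
          FStatus := ssError;
          EventLog.LogError(-1, 'TdmMain.RefreshAlarmList', 'Message: ' + E.Message);
          try
            // Hypothetical recovery: drop and rebuild the session so the next
            // poll starts from known-good connection state.
            qryAlarmList.Session.LogOff;
            qryAlarmList.Session.LogOn;
          except
            // Swallow here; the next RefreshAlarmList will log any new failure.
          end;
        end;
      end;
    end;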

    Read the article

  • Is there any better IDOMImplementation other than MSXML?

    - by Chau Chee Yang
    There are three IDOMImplementations available in Delphi:
    - MSXML
    - Xerces XML
    - ADOM XML v4

    MSXML is the default IDOMImplementation. My test counts the time needed to load a 10 MB XML file. I use a Delphi unit generated from an XSD using XML data binding to load the XML file. This unit has three common functions:

    function Getmenubar(Doc: IXMLDocument): IXMLMenubarType;
    function Loadmenubar(const FileName: WideString): IXMLMenubarType;
    function Newmenubar: IXMLMenubarType;

    I've read comments on the web claiming that MSXML's overhead is high and that it doesn't perform well compared to other XML parsers. However, my tests show that MSXML is the best of the three, with Xerces XML second and ADOM XML v4 by far the worst:

    MSXML       - 0.6410 seconds
    Xerces XML  - 2.4220 seconds
    ADOM XML v4 - 67.50 seconds

    I also came across OmniXML, which claims much better performance than MSXML, but I never managed to use it with the unit generated by XML data binding. Is there any other vendor implementation of Delphi's IDOMImplementation that works much better than MSXML? I am using Delphi 2010 and Windows 7.
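
    For reference, a sketch of how such a comparison can be driven in Delphi 2010 (the data-binding unit and XML file names are placeholders, and the vendor-registration unit names may vary by edition):

    program TimeDomVendors;
    {$APPTYPE CONSOLE}

    uses
      SysUtils, Diagnostics,
      xmldom, msxmldom, xercesxmldom, oxmldom, // each unit registers its DOM vendor
      MenubarXML in 'MenubarXML.pas';          // generated data-binding unit (name assumed)

    procedure TimeVendor(const vendor, fileName: string);
    var
      sw: TStopwatch;
    begin
      DefaultDOMVendor := vendor;  // select the IDOMImplementation before parsing
      sw := TStopwatch.StartNew;
      Loadmenubar(fileName);       // generated loader; parses the whole document
      sw.Stop;
      Writeln(Format('%-12s %d ms', [vendor, sw.ElapsedMilliseconds]));
    end;

    begin
      TimeVendor('MSXML', 'menubar.xml');
      TimeVendor('Xerces XML', 'menubar.xml');
      TimeVendor('ADOM XML v4', 'menubar.xml');
    end.

    Because the OS file cache warms up after the first pass, the later vendors get a small advantage; running each vendor in a separate process, or doing a throwaway load first, gives fairer numbers.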

    Read the article

  • Cannot open files in Visual Studio but in Delphi and Notepad

    - by Andrew J. Brehm
    About an hour ago, Visual Studio 2008 decided that it cannot find files any more. This is on 64-bit Windows Vista. When I right-click on a text file (source code or otherwise) and select "open with" and "Visual Studio 2008", I get the following error (example): Windows cannot find 'C:\Users\ajbrehm\Documents\Visual Studio 2008\Projects\Hello Prism\Hello Prism\Main.pas'. Make sure you typed the name correctly, and then try again. When I right-click the same file and select "open with" and "Delphi 2010" or "Notepad" (the two other options available for text files on my system), the file opens correctly. Oddly enough, when the file is part of a Visual Studio project and I open the project itself with Visual Studio (this works), I can open the file from within Visual Studio. Any ideas what might be going on? This started about an hour after I made a complete backup of my Vista VM and after I installed IIS 7, SQL Express, and SourceGear Vault. The first files I noticed couldn't be opened in Visual Studio any more were Pascal source files in checked-out folders from Vault. And Vault also seems to be unable to see one of the source files and claims it doesn't exist. I found out about Visual Studio not opening ANY files any more when I tried to recreate the file Vault refused to see. Update: I just checked. Another user, "administrator", can still open text files with Visual Studio 2008. Both users have administrator rights. Update: I just restored the hours-old backup. Same problem. Apparently whatever triggered this happened before the install of IIS 7 and SQL Express; I just never noticed it before.

    Read the article
